Oct 14 13:01:00 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 14 13:01:01 crc restorecon[4668]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 14 13:01:01 crc restorecon[4668]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 
13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 
13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 
crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 14 13:01:01 crc restorecon[4668]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc 
restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 14 13:01:01 crc restorecon[4668]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 14 13:01:02 crc kubenswrapper[4837]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 14 13:01:02 crc kubenswrapper[4837]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 14 13:01:02 crc kubenswrapper[4837]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 14 13:01:02 crc kubenswrapper[4837]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 14 13:01:02 crc kubenswrapper[4837]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 14 13:01:02 crc kubenswrapper[4837]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.504592 4837 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511548 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511584 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511593 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511603 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511611 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511620 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511632 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511643 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511652 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511661 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511669 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511679 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511687 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511694 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511702 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511710 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511717 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511725 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511733 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511740 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511748 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511763 4837 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511772 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511779 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511787 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511795 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511802 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511811 4837 feature_gate.go:330] unrecognized feature gate: Example Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511821 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511829 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511838 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511846 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511854 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511861 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511869 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511881 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511891 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511900 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511908 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511917 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511924 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511932 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511941 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511949 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511958 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511966 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511974 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511981 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.511990 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512000 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512009 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512018 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512028 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512037 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512045 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512053 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512061 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512069 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512076 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512084 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512092 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512099 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512107 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512114 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration 
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512125 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512133 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512140 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512150 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512186 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512195 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.512203 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512368 4837 flags.go:64] FLAG: --address="0.0.0.0" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512384 4837 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512398 4837 flags.go:64] FLAG: --anonymous-auth="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512411 4837 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512424 4837 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512434 4837 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512447 4837 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512458 4837 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512468 4837 
flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512478 4837 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512487 4837 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512497 4837 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512506 4837 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512515 4837 flags.go:64] FLAG: --cgroup-root="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512524 4837 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512533 4837 flags.go:64] FLAG: --client-ca-file="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512542 4837 flags.go:64] FLAG: --cloud-config="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512550 4837 flags.go:64] FLAG: --cloud-provider="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512559 4837 flags.go:64] FLAG: --cluster-dns="[]" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512570 4837 flags.go:64] FLAG: --cluster-domain="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512579 4837 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512588 4837 flags.go:64] FLAG: --config-dir="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512597 4837 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512607 4837 flags.go:64] FLAG: --container-log-max-files="5" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512618 4837 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 
13:01:02.512627 4837 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512636 4837 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512645 4837 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512654 4837 flags.go:64] FLAG: --contention-profiling="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512663 4837 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512672 4837 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512681 4837 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512690 4837 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512701 4837 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512710 4837 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512720 4837 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512729 4837 flags.go:64] FLAG: --enable-load-reader="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512738 4837 flags.go:64] FLAG: --enable-server="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512747 4837 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512758 4837 flags.go:64] FLAG: --event-burst="100" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512769 4837 flags.go:64] FLAG: --event-qps="50" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512777 4837 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 14 
13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512786 4837 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512795 4837 flags.go:64] FLAG: --eviction-hard="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512806 4837 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512815 4837 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512824 4837 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512833 4837 flags.go:64] FLAG: --eviction-soft="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512842 4837 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512851 4837 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512861 4837 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512871 4837 flags.go:64] FLAG: --experimental-mounter-path="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512880 4837 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512889 4837 flags.go:64] FLAG: --fail-swap-on="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512898 4837 flags.go:64] FLAG: --feature-gates="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512909 4837 flags.go:64] FLAG: --file-check-frequency="20s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512918 4837 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512928 4837 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512937 4837 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512946 4837 flags.go:64] FLAG: --healthz-port="10248" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512955 4837 flags.go:64] FLAG: --help="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512965 4837 flags.go:64] FLAG: --hostname-override="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512974 4837 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512983 4837 flags.go:64] FLAG: --http-check-frequency="20s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.512994 4837 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513003 4837 flags.go:64] FLAG: --image-credential-provider-config="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513012 4837 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513021 4837 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513029 4837 flags.go:64] FLAG: --image-service-endpoint="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513038 4837 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513047 4837 flags.go:64] FLAG: --kube-api-burst="100" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513056 4837 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513066 4837 flags.go:64] FLAG: --kube-api-qps="50" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513074 4837 flags.go:64] FLAG: --kube-reserved="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513084 4837 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513092 4837 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513104 4837 flags.go:64] FLAG: --kubelet-cgroups="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513113 4837 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513122 4837 flags.go:64] FLAG: --lock-file="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513130 4837 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513139 4837 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513149 4837 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513187 4837 flags.go:64] FLAG: --log-json-split-stream="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513196 4837 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513205 4837 flags.go:64] FLAG: --log-text-split-stream="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513214 4837 flags.go:64] FLAG: --logging-format="text" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513223 4837 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513233 4837 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513242 4837 flags.go:64] FLAG: --manifest-url="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513250 4837 flags.go:64] FLAG: --manifest-url-header="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513262 4837 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513271 4837 flags.go:64] FLAG: --max-open-files="1000000" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513283 4837 
flags.go:64] FLAG: --max-pods="110" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513292 4837 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513301 4837 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513310 4837 flags.go:64] FLAG: --memory-manager-policy="None" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513319 4837 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513329 4837 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513338 4837 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513347 4837 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513366 4837 flags.go:64] FLAG: --node-status-max-images="50" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513375 4837 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513385 4837 flags.go:64] FLAG: --oom-score-adj="-999" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513394 4837 flags.go:64] FLAG: --pod-cidr="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513403 4837 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513417 4837 flags.go:64] FLAG: --pod-manifest-path="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513426 4837 flags.go:64] FLAG: --pod-max-pids="-1" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513435 4837 flags.go:64] FLAG: --pods-per-core="0" Oct 14 13:01:02 
crc kubenswrapper[4837]: I1014 13:01:02.513444 4837 flags.go:64] FLAG: --port="10250" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513453 4837 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513462 4837 flags.go:64] FLAG: --provider-id="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513471 4837 flags.go:64] FLAG: --qos-reserved="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513481 4837 flags.go:64] FLAG: --read-only-port="10255" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513490 4837 flags.go:64] FLAG: --register-node="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513499 4837 flags.go:64] FLAG: --register-schedulable="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513508 4837 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513522 4837 flags.go:64] FLAG: --registry-burst="10" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513531 4837 flags.go:64] FLAG: --registry-qps="5" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513540 4837 flags.go:64] FLAG: --reserved-cpus="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513549 4837 flags.go:64] FLAG: --reserved-memory="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513560 4837 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513569 4837 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513578 4837 flags.go:64] FLAG: --rotate-certificates="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513588 4837 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513597 4837 flags.go:64] FLAG: --runonce="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513606 4837 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513615 4837 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513624 4837 flags.go:64] FLAG: --seccomp-default="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513652 4837 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513662 4837 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513671 4837 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513680 4837 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513689 4837 flags.go:64] FLAG: --storage-driver-password="root" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513698 4837 flags.go:64] FLAG: --storage-driver-secure="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513707 4837 flags.go:64] FLAG: --storage-driver-table="stats" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513716 4837 flags.go:64] FLAG: --storage-driver-user="root" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513725 4837 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513734 4837 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513743 4837 flags.go:64] FLAG: --system-cgroups="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513752 4837 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513765 4837 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513774 4837 flags.go:64] FLAG: --tls-cert-file="" Oct 14 
13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513782 4837 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513794 4837 flags.go:64] FLAG: --tls-min-version="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513803 4837 flags.go:64] FLAG: --tls-private-key-file="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513812 4837 flags.go:64] FLAG: --topology-manager-policy="none" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513821 4837 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513830 4837 flags.go:64] FLAG: --topology-manager-scope="container" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513841 4837 flags.go:64] FLAG: --v="2" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513853 4837 flags.go:64] FLAG: --version="false" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513864 4837 flags.go:64] FLAG: --vmodule="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513875 4837 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.513885 4837 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514094 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514107 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514117 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514126 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514134 4837 feature_gate.go:330] unrecognized feature gate: Example Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514142 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514151 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514184 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514201 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514210 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514217 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514225 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514233 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514241 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514249 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 
13:01:02.514257 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514264 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514272 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514280 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514288 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514295 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514303 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514311 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514318 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514326 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514334 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514342 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514352 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514361 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514368 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514376 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514415 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514427 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514437 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514451 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514461 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514470 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514480 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514492 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514504 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514520 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514531 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514541 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514549 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514558 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514567 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514574 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514582 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514590 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514598 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514605 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514613 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514621 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514629 4837 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514636 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514644 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514651 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514659 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514666 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514674 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514682 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514689 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514697 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514704 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514712 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514720 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514730 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514746 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs 
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514754 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514765 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.514776 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.514799 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.526319 4837 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.526351 4837 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526450 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526462 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526470 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526478 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526485 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526494 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526502 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526510 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526518 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526525 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526531 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526537 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526543 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526550 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526557 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526563 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526569 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526576 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526585 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526591 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526597 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526602 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526607 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526612 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526617 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526623 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526628 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526633 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526638 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526643 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526648 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526653 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526659 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526664 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526671 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526677 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526682 4837 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526688 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526693 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526698 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526705 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526712 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526717 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526722 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526727 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526732 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526739 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526746 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526752 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526757 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526763 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526768 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526773 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526778 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526783 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526788 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526794 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526801 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526806 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526812 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526818 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526823 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526828 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526833 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526838 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526843 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526848 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526853 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526858 4837 feature_gate.go:330] unrecognized feature gate: Example
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526863 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.526869 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.526878 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527060 4837 feature_gate.go:330] unrecognized feature gate: Example
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527079 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527087 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527093 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527099 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527105 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527111 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527116 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527121 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527128 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527133 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527138 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527143 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527148 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527153 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527189 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527199 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527206 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527212 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527220 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527229 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527237 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527243 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527248 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527253 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527259 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527263 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527268 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527273 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527280 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527287 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527292 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527299 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527304 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527319 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527333 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527344 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527350 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527358 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527365 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527372 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527378 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527383 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527388 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527393 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527398 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527403 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527408 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527413 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527418 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527422 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527427 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527432 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527437 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527442 4837 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527446 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527451 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527456 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527461 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527466 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527471 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527477 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527483 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527488 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527495 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527501 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527507 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527512 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527517 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527523 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.527530 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.527538 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.527712 4837 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.532274 4837 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.532387 4837 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.534294 4837 server.go:997] "Starting client certificate rotation"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.534362 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.534538 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-19 13:57:12.951149827 +0000 UTC
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.534693 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1584h56m10.416462534s for next certificate rotation
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.567536 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.570461 4837 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.595294 4837 log.go:25] "Validated CRI v1 runtime API"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.633293 4837 log.go:25] "Validated CRI v1 image API"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.635406 4837 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.643946 4837 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-14-12-56-22-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.643991 4837 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.676513 4837 manager.go:217] Machine: {Timestamp:2025-10-14 13:01:02.672878143 +0000 UTC m=+0.589878036 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ea84f05e-4f20-4ec0-a4d1-23ededd0f865 BootID:0f27438a-32be-43a2-9e58-7bdea433e25c Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:87:eb:09 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:87:eb:09 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ea:f9:4d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:37:d9:f4 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:58:d6:a2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:44:e3:eb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:ca:89:c3:16:d7 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:82:d6:61:68:ec:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.676878 4837 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.677135 4837 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.678478 4837 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.678820 4837 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.678886 4837 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.679282 4837 topology_manager.go:138] "Creating topology manager with none policy"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.679301 4837 container_manager_linux.go:303] "Creating device plugin manager"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.679957 4837 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.679999 4837 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.680282 4837 state_mem.go:36] "Initialized new in-memory state store"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.680746 4837 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.686074 4837 kubelet.go:418] "Attempting to sync node with API server"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.686106 4837 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.686129 4837 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.686148 4837 kubelet.go:324] "Adding apiserver pod source"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.686193 4837 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.692757 4837 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.695484 4837 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.698303 4837 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.698867 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.699294 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError" Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.699441 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.699555 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700392 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700441 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700461 
4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700479 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700508 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700526 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700543 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700571 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700591 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700612 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700636 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.700655 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.702728 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.703489 4837 server.go:1280] "Started kubelet" Oct 14 13:01:02 crc systemd[1]: Started Kubernetes Kubelet. 
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.712045 4837 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.712699 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.712069 4837 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.713286 4837 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717006 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717270 4837 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717539 4837 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717575 4837 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717288 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:47:38.29582973 +0000 UTC Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717808 4837 server.go:460] "Adding debug handlers to kubelet server" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717820 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1326h46m35.57802099s for next certificate rotation Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.717900 4837 desired_state_of_world_populator.go:146] 
"Desired state populator starts to run" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.718450 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.717969 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.66:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186e5d182963d32e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-14 13:01:02.703448878 +0000 UTC m=+0.620448721,LastTimestamp:2025-10-14 13:01:02.703448878 +0000 UTC m=+0.620448721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.721472 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.721630 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.721858 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.66:6443: connect: connection refused" interval="200ms" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.721936 4837 factory.go:153] Registering CRI-O factory Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.721969 4837 factory.go:221] Registration of the crio container factory successfully Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.722076 4837 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.722092 4837 factory.go:55] Registering systemd factory Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.722105 4837 factory.go:221] Registration of the systemd container factory successfully Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.722134 4837 factory.go:103] Registering Raw factory Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.722156 4837 manager.go:1196] Started watching for new ooms in manager Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.723135 4837 manager.go:319] Starting recovery of all containers Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.732058 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.732132 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 14 
13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.732154 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.732206 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.732225 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735550 4837 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735601 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735620 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735637 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735652 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735664 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735678 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735699 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735711 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735728 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735746 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735790 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735806 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735825 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735838 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735850 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735862 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735872 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735885 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735898 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735909 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735923 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735940 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735954 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735971 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735985 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.735998 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736010 4837 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736023 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736037 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736052 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736064 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736093 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736108 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736121 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736133 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736146 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736227 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736245 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736256 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736269 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736281 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736296 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736309 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736321 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736338 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736351 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736365 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736384 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736399 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736413 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736446 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736775 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736791 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736804 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736816 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736829 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736843 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: 
I1014 13:01:02.736858 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736871 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736885 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736899 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736913 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736926 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736939 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736953 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736965 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736979 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.736991 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.737005 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.737016 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.737029 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.737044 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738211 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738301 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738326 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738359 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738380 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738406 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738424 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738440 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738464 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738482 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738498 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738522 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738539 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738563 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.738582 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739125 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739177 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739209 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739227 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739243 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739267 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739283 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739304 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739320 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739338 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739362 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739379 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739419 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739449 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739473 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739498 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739522 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739542 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739563 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739581 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739608 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739634 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739651 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739676 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739692 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739710 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739733 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739750 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739773 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739791 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739810 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739831 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739850 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739867 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739888 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739902 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739925 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739940 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739954 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739974 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.739991 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740011 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740040 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740056 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740078 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740093 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740116 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740132 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740147 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740186 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740200 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740220 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740233 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740246 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740263 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740277 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740294 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740308 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740323 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740339 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740353 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740369 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740387 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740401 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740421 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740438 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740451 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740467 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740488 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740506 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740521 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740535 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740552 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740564 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740584 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740598 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740612 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740630 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740643 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740660 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740673 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740687 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740704 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740716 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740729 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740746 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740757 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740775 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740787 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740799 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740817 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740829 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740849 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740865 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f"
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740879 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740897 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740911 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740928 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740942 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740958 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740979 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.740994 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741013 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741062 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741077 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741097 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741113 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741131 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741144 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741173 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741192 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741205 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741219 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741239 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741253 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741266 4837 reconstruct.go:97] "Volume reconstruction finished" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741277 4837 reconciler.go:26] "Reconciler: start to sync state" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.741948 4837 manager.go:324] Recovery completed Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.753909 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.755222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.755274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.755291 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.756172 4837 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.756192 4837 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.756210 4837 state_mem.go:36] "Initialized new in-memory state store" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.780337 4837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.783202 4837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.783237 4837 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.783264 4837 kubelet.go:2335] "Starting kubelet main sync loop" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.783301 4837 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 14 13:01:02 crc kubenswrapper[4837]: W1014 13:01:02.783783 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.783827 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError" Oct 14 13:01:02 crc 
kubenswrapper[4837]: I1014 13:01:02.784197 4837 policy_none.go:49] "None policy: Start" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.785620 4837 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.785703 4837 state_mem.go:35] "Initializing new in-memory state store" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.818873 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.843289 4837 manager.go:334] "Starting Device Plugin manager" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.843382 4837 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.843404 4837 server.go:79] "Starting device plugin registration server" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.843963 4837 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.843995 4837 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.844806 4837 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.844939 4837 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.844954 4837 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.854777 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.884059 4837 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.884222 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.886669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.886725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.886742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.886945 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.887185 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.887234 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888066 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888107 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888144 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888437 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888628 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.888695 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.889744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.889825 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.889855 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.890028 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.890147 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.890212 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.890734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.890768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.890779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891425 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.891826 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc 
kubenswrapper[4837]: I1014 13:01:02.891955 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.892008 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.892947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.892981 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.892997 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.893055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.893080 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.893108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.893362 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.893417 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.894357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.894380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.894389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.923036 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.66:6443: connect: connection refused" interval="400ms" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943345 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943404 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943525 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943591 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943643 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943789 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943836 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.943987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.944066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.944122 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 
13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.944132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.944349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.944393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.945325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.945396 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.945420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:02 crc kubenswrapper[4837]: I1014 13:01:02.945461 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 14 13:01:02 crc kubenswrapper[4837]: E1014 13:01:02.945989 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.66:6443: connect: connection refused" node="crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045653 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045830 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045928 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046058 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046118 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045984 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046128 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.045994 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046330 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046371 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046405 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046433 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046559 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046603 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046639 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046685 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.046769 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.147148 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.149280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.149355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.149381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.149418 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: E1014 13:01:03.149975 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.66:6443: connect: connection refused" node="crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.214837 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.235069 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.246631 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.265637 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.267218 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2542758945a6b64882417ede7396f1a937ae49bae33293ee895ef6c13af9cf97 WatchSource:0}: Error finding container 2542758945a6b64882417ede7396f1a937ae49bae33293ee895ef6c13af9cf97: Status 404 returned error can't find the container with id 2542758945a6b64882417ede7396f1a937ae49bae33293ee895ef6c13af9cf97
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.268208 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d1372137400b2d66259ff20231a328d5d1b43b21ce4af7a88abce3ccd0ca9cbc WatchSource:0}: Error finding container d1372137400b2d66259ff20231a328d5d1b43b21ce4af7a88abce3ccd0ca9cbc: Status 404 returned error can't find the container with id d1372137400b2d66259ff20231a328d5d1b43b21ce4af7a88abce3ccd0ca9cbc
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.270991 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a687eb691dfed426d7d2069de11d56ef1a7ccf48e862ee6d3ed9ec3cfae76d65 WatchSource:0}: Error finding container a687eb691dfed426d7d2069de11d56ef1a7ccf48e862ee6d3ed9ec3cfae76d65: Status 404 returned error can't find the container with id a687eb691dfed426d7d2069de11d56ef1a7ccf48e862ee6d3ed9ec3cfae76d65
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.276942 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.277733 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1e9ef20ad311a5d27b0d33cd8b4bd1071f9ad04d66d54e079731440fa1ef2883 WatchSource:0}: Error finding container 1e9ef20ad311a5d27b0d33cd8b4bd1071f9ad04d66d54e079731440fa1ef2883: Status 404 returned error can't find the container with id 1e9ef20ad311a5d27b0d33cd8b4bd1071f9ad04d66d54e079731440fa1ef2883
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.293177 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6534add9774806e81285a07176b03883a176994e00795b64d092e4f8ca209198 WatchSource:0}: Error finding container 6534add9774806e81285a07176b03883a176994e00795b64d092e4f8ca209198: Status 404 returned error can't find the container with id 6534add9774806e81285a07176b03883a176994e00795b64d092e4f8ca209198
Oct 14 13:01:03 crc kubenswrapper[4837]: E1014 13:01:03.324740 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.66:6443: connect: connection refused" interval="800ms"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.551202 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.553589 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.553650 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.553665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.553695 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: E1014 13:01:03.554242 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.66:6443: connect: connection refused" node="crc"
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.636619 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:03 crc kubenswrapper[4837]: E1014 13:01:03.636704 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError"
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.713724 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.788546 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e9ef20ad311a5d27b0d33cd8b4bd1071f9ad04d66d54e079731440fa1ef2883"}
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.789653 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a687eb691dfed426d7d2069de11d56ef1a7ccf48e862ee6d3ed9ec3cfae76d65"}
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.790925 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d1372137400b2d66259ff20231a328d5d1b43b21ce4af7a88abce3ccd0ca9cbc"}
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.792101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2542758945a6b64882417ede7396f1a937ae49bae33293ee895ef6c13af9cf97"}
Oct 14 13:01:03 crc kubenswrapper[4837]: I1014 13:01:03.793032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6534add9774806e81285a07176b03883a176994e00795b64d092e4f8ca209198"}
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.884251 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:03 crc kubenswrapper[4837]: E1014 13:01:03.884426 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError"
Oct 14 13:01:03 crc kubenswrapper[4837]: W1014 13:01:03.953710 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:03 crc kubenswrapper[4837]: E1014 13:01:03.953801 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError"
Oct 14 13:01:04 crc kubenswrapper[4837]: E1014 13:01:04.126275 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.66:6443: connect: connection refused" interval="1.6s"
Oct 14 13:01:04 crc kubenswrapper[4837]: W1014 13:01:04.272898 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:04 crc kubenswrapper[4837]: E1014 13:01:04.273021 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.66:6443: connect: connection refused" logger="UnhandledError"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.354747 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.356974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.357041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.357061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.357095 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 14 13:01:04 crc kubenswrapper[4837]: E1014 13:01:04.357658 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.66:6443: connect: connection refused" node="crc"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.714042 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.800263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.800325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.800345 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.800366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.800346 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.802149 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.802246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.802265 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.803014 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b" exitCode=0
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.803146 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.803261 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.804645 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.804699 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.804724 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.805541 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5" exitCode=0
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.805646 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.805827 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.807436 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.807453 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.807705 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.807734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.808727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.808764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.808779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.808882 4837 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="49351a69e42645c69430e052855c74096a9db20b06e7e80794fb86eecafacf38" exitCode=0
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.808969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"49351a69e42645c69430e052855c74096a9db20b06e7e80794fb86eecafacf38"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.808983 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.810270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.810326 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.810349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.813701 4837 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37" exitCode=0
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.813757 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37"}
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.813903 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.814956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.814994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:04 crc kubenswrapper[4837]: I1014 13:01:04.815005 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.433041 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 14 13:01:05 crc kubenswrapper[4837]: E1014 13:01:05.446290 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.66:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186e5d182963d32e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-14 13:01:02.703448878 +0000 UTC m=+0.620448721,LastTimestamp:2025-10-14 13:01:02.703448878 +0000 UTC m=+0.620448721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.714620 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.66:6443: connect: connection refused
Oct 14 13:01:05 crc kubenswrapper[4837]: E1014 13:01:05.727471 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.66:6443: connect: connection refused" interval="3.2s"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.819638 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86" exitCode=0
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.819695 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.819997 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.821975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.822019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.822031 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.830033 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b409dc0335a243f5845885ee323a095c53089eef877862b1eecbec9036e0370f"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.830153 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.833957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.834000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.834012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.834951 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.835056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.835113 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.835135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.836792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.836839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.836857 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.848666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.848730 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.848744 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.848756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406"}
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.848770 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.853314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.853345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.853357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.959631 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.963667 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.963708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.963720 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 14 13:01:05 crc kubenswrapper[4837]: I1014 13:01:05.963745 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 14 13:01:05 crc
kubenswrapper[4837]: E1014 13:01:05.964234 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.66:6443: connect: connection refused" node="crc" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.855436 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd" exitCode=0 Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.855587 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.855514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd"} Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.857008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.857063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.857082 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.860864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0"} Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.860920 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 
13:01:06.860992 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.861136 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.861150 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.861363 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.862599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.862652 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.862675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:06 crc 
kubenswrapper[4837]: I1014 13:01:06.863343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863358 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:06 crc kubenswrapper[4837]: I1014 13:01:06.863556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.868864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139"} Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.868934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01"} Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.868951 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.869010 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.868956 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476"} Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.870639 4837 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.870677 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:07 crc kubenswrapper[4837]: I1014 13:01:07.870694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.432992 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.433464 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.451472 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.880080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae"} Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.880206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823"} Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.880299 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.880342 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.882096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.882155 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.882199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.882592 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.882664 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:08 crc kubenswrapper[4837]: I1014 13:01:08.882687 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.111994 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.112569 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.114594 4837 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.114825 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.115069 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.165188 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.166827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.166866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.166876 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.166900 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.883657 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.884970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.885051 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:09 crc kubenswrapper[4837]: I1014 13:01:09.885076 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.538956 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.539234 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.540851 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.541038 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.541116 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.819696 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.886203 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.887280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.887311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:10 crc kubenswrapper[4837]: I1014 13:01:10.887320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:11 crc kubenswrapper[4837]: I1014 13:01:11.824198 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 14 13:01:11 crc kubenswrapper[4837]: I1014 13:01:11.824493 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:11 crc kubenswrapper[4837]: 
I1014 13:01:11.826222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:11 crc kubenswrapper[4837]: I1014 13:01:11.826285 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:11 crc kubenswrapper[4837]: I1014 13:01:11.826302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:12 crc kubenswrapper[4837]: I1014 13:01:12.130493 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 14 13:01:12 crc kubenswrapper[4837]: I1014 13:01:12.130917 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:12 crc kubenswrapper[4837]: I1014 13:01:12.133278 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:12 crc kubenswrapper[4837]: I1014 13:01:12.133323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:12 crc kubenswrapper[4837]: I1014 13:01:12.133342 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:12 crc kubenswrapper[4837]: E1014 13:01:12.855051 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.340339 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.340714 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.342617 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.342687 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.342710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.348036 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.648745 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.656844 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.893627 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.894897 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.894950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:13 crc kubenswrapper[4837]: I1014 13:01:13.894968 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:14 crc kubenswrapper[4837]: I1014 13:01:14.896866 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:14 crc kubenswrapper[4837]: I1014 13:01:14.898494 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:14 crc kubenswrapper[4837]: I1014 13:01:14.898592 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:14 crc kubenswrapper[4837]: I1014 13:01:14.898611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:16 crc kubenswrapper[4837]: W1014 13:01:16.499124 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.499289 4837 trace.go:236] Trace[796182800]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Oct-2025 13:01:06.497) (total time: 10001ms): Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[796182800]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:01:16.499) Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[796182800]: [10.001411726s] [10.001411726s] END Oct 14 13:01:16 crc kubenswrapper[4837]: E1014 13:01:16.499321 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.606067 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44748->192.168.126.11:17697: read: 
connection reset by peer" start-of-body= Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.606224 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44748->192.168.126.11:17697: read: connection reset by peer" Oct 14 13:01:16 crc kubenswrapper[4837]: W1014 13:01:16.647826 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.648000 4837 trace.go:236] Trace[1164672370]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Oct-2025 13:01:06.645) (total time: 10002ms): Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[1164672370]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:01:16.647) Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[1164672370]: [10.002000129s] [10.002000129s] END Oct 14 13:01:16 crc kubenswrapper[4837]: E1014 13:01:16.648040 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 14 13:01:16 crc kubenswrapper[4837]: W1014 13:01:16.686832 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": 
net/http: TLS handshake timeout Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.686971 4837 trace.go:236] Trace[1793192616]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Oct-2025 13:01:06.684) (total time: 10001ms): Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[1793192616]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:01:16.686) Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[1793192616]: [10.001965494s] [10.001965494s] END Oct 14 13:01:16 crc kubenswrapper[4837]: E1014 13:01:16.687010 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.715534 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 14 13:01:16 crc kubenswrapper[4837]: W1014 13:01:16.796667 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.797067 4837 trace.go:236] Trace[466653961]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Oct-2025 13:01:06.794) (total time: 10002ms): Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[466653961]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": 
net/http: TLS handshake timeout 10002ms (13:01:16.796) Oct 14 13:01:16 crc kubenswrapper[4837]: Trace[466653961]: [10.002570108s] [10.002570108s] END Oct 14 13:01:16 crc kubenswrapper[4837]: E1014 13:01:16.797097 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.881641 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.882088 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.883353 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.883406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.883418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.894775 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not 
found]","reason":"Forbidden","details":{},"code":403} Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.894853 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.903570 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.903682 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.903722 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.908175 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0" exitCode=255 Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.908221 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0"} Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.908371 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.909072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.909099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.909109 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:16 crc kubenswrapper[4837]: I1014 13:01:16.909656 4837 scope.go:117] "RemoveContainer" containerID="e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0" Oct 14 13:01:17 crc kubenswrapper[4837]: I1014 13:01:17.912988 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 14 13:01:17 crc kubenswrapper[4837]: I1014 13:01:17.916082 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b"} Oct 14 13:01:17 crc kubenswrapper[4837]: I1014 13:01:17.916248 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:17 crc kubenswrapper[4837]: I1014 13:01:17.917138 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:17 crc kubenswrapper[4837]: I1014 13:01:17.917216 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:17 crc kubenswrapper[4837]: I1014 13:01:17.917237 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.433609 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.433748 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.451979 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.918931 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.920398 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.920475 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:18 crc kubenswrapper[4837]: I1014 13:01:18.920501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:20 crc 
kubenswrapper[4837]: I1014 13:01:20.547937 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:20 crc kubenswrapper[4837]: I1014 13:01:20.548199 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:20 crc kubenswrapper[4837]: I1014 13:01:20.554937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:20 crc kubenswrapper[4837]: I1014 13:01:20.555238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:20 crc kubenswrapper[4837]: I1014 13:01:20.555400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:20 crc kubenswrapper[4837]: I1014 13:01:20.561521 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:20 crc kubenswrapper[4837]: I1014 13:01:20.905698 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.699334 4837 apiserver.go:52] "Watching apiserver" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.699377 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.705322 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.705808 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.706368 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.706588 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:21 crc kubenswrapper[4837]: E1014 13:01:21.706703 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.706587 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.706507 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:21 crc kubenswrapper[4837]: E1014 13:01:21.706912 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.706942 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.706959 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:21 crc kubenswrapper[4837]: E1014 13:01:21.707043 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.709099 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.709805 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.709823 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.710839 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.710992 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.711130 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.711193 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.712559 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.714014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.719016 4837 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 14 13:01:21 crc kubenswrapper[4837]: 
I1014 13:01:21.752721 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.773687 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.789514 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.804396 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.819309 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.836076 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.851776 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.872441 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.888753 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:21 crc kubenswrapper[4837]: E1014 13:01:21.891149 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.894633 4837 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 14 13:01:21 crc kubenswrapper[4837]: E1014 13:01:21.895978 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra 
config cache not synchronized" node="crc" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.921983 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.995785 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.995841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.995873 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.995904 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996396 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: 
"fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996413 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996508 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996544 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996655 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996712 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996759 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996887 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996943 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.996992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997011 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997084 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997125 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997188 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997220 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997251 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997280 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997311 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997340 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997398 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997429 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997433 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997579 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997638 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997688 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997725 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997796 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997842 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997946 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998010 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998044 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.997982 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998105 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998192 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998232 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 
13:01:21.998266 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998328 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998360 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998420 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998451 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998484 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998548 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998583 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998612 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998642 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998639 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998673 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998702 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998736 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998764 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998793 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998823 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998850 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998881 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998938 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998968 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.998994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999020 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999043 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999071 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999100 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999125 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999149 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999249 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999289 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999327 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999359 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999389 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 14 13:01:21 crc kubenswrapper[4837]: I1014 13:01:21.999459 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 14 13:01:22 crc 
kubenswrapper[4837]: I1014 13:01:21.999516 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999547 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999577 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999611 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999641 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999778 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999807 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999839 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999870 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999901 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999930 
4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999995 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000031 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000061 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000093 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000121 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000154 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000209 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000241 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000272 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: 
I1014 13:01:22.000302 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000337 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000366 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000467 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000543 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000579 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000640 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000671 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 
13:01:22.000704 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000733 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000766 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000797 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000829 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000861 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000894 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000924 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000955 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000982 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: 
I1014 13:01:22.001044 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001076 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001107 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001135 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001307 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001340 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001367 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001437 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001473 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001508 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 14 13:01:22 crc 
kubenswrapper[4837]: I1014 13:01:22.001539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001571 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001638 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001670 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001727 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001756 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001786 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001817 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 14 13:01:22 crc 
kubenswrapper[4837]: I1014 13:01:22.001879 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001909 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001968 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002056 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002089 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002152 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002216 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002252 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002287 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002319 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002352 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002416 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002448 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002478 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002511 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002594 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 
13:01:22.002626 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002660 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002695 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002728 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002761 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002797 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002831 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002865 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002902 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.002971 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003007 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003107 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003142 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003195 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003291 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003328 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003361 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003397 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003430 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 14 13:01:22 crc 
kubenswrapper[4837]: I1014 13:01:22.003465 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003501 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003533 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003567 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003600 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003634 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003667 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003762 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003844 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003915 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003949 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004021 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004091 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004123 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004178 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 
13:01:22.004212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004302 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004326 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004344 4837 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004364 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004382 4837 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004402 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004422 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004441 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004460 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004478 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004496 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004513 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004531 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004553 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005960 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999283 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999345 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999351 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999508 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999693 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999692 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999847 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:21.999932 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000132 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000194 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000285 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.020234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000381 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000644 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000665 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000652 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000692 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000840 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000991 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.000892 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001000 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001101 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001133 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001122 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001374 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001469 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.001810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003798 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.003812 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004275 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.004765 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005044 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005391 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005520 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005874 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005953 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.005950 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.006001 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.006281 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.007521 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.007560 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.007601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.008061 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.008100 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.008244 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.008490 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.008886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.009010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.009499 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.009750 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.010011 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.010069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.009714 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.010409 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.012049 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.012799 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.012967 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.013361 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.013500 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.013767 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.013944 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.013839 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.013967 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.014934 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015043 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.014882 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015321 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015652 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015799 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015906 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.015862 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.016003 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.016039 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:01:22.516008744 +0000 UTC m=+20.433008637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.017698 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.017726 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.017769 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.017352 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.018228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.018462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.018610 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.018807 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.018818 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.019099 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.019741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.020198 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.020767 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.020800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.020530 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.021154 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.021294 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.021327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.021333 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.021425 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.021768 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.022205 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.022264 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.022313 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.022355 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.023259 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.022514 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.023744 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.023863 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.024518 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.024855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.025040 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.025244 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.025425 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.025685 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.025724 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.025971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.026076 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.026564 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.026959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.027292 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.027878 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.028234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.028404 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.028466 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.028843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.028889 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029149 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029194 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029186 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029391 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029612 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.029627 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.030152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.030410 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.030622 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.030971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.031614 4837 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.031695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.032277 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:01:22.532249717 +0000 UTC m=+20.449249540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.032308 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:22.532295778 +0000 UTC m=+20.449295591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.034970 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.035420 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.035665 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.035788 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.037146 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.038270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.038600 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.039109 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.042476 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.045423 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.047114 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.047152 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.047198 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.047299 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:22.5472752 +0000 UTC m=+20.464275043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.049050 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.049834 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.049868 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.049889 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.050007 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:01:22.549980762 +0000 UTC m=+20.466980615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.053489 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.055705 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.055951 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.056421 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.056890 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.057647 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.057802 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.057796 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.058338 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.058463 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.058571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.059399 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.059535 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.061788 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.061936 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.062074 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.064542 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.064829 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.064840 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.065059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.065203 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.065308 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.065646 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.065803 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.066104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.069198 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.069650 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.080130 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.080146 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.083054 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.083425 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.083783 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.084076 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.084214 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.084238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.086472 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.086675 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.086892 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.087199 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.094015 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.102364 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.103095 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105237 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105380 4837 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105404 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105423 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105444 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105468 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105490 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105507 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105524 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105541 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105558 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105590 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105604 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105613 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105622 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105812 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105822 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105831 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105840 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105849 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105858 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105867 4837 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105876 4837 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105886 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 
13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105895 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105903 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105912 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105921 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105929 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105938 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105948 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105957 4837 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105967 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105975 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105985 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.105994 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106005 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106014 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106024 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106033 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106043 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106053 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106062 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106072 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106081 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106090 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on 
node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106099 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106108 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106117 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106127 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106136 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106144 4837 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106152 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 
13:01:22.106183 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106191 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106201 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106210 4837 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106218 4837 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106228 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106236 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106245 4837 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106253 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106262 4837 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106271 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106280 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106289 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106297 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106305 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106314 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106323 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106331 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106340 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106350 4837 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106358 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106366 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106375 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106383 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106392 4837 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106401 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106410 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106420 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106429 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106437 4837 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106445 4837 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106453 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106460 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106469 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106478 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106486 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106494 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106502 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106511 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106519 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106527 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106536 4837 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106567 4837 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106576 4837 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106584 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106592 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106600 4837 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106609 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106618 4837 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106627 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106636 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 
crc kubenswrapper[4837]: I1014 13:01:22.106644 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106652 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106816 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106825 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106836 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106844 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106854 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106863 4837 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106873 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106884 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106895 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106906 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106917 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106927 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106936 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106944 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106952 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106961 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106977 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106985 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.106993 4837 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107001 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" 
Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107009 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107018 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107026 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107034 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107042 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107050 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107058 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107066 4837 reconciler_common.go:293] 
"Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107075 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107084 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107093 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107102 4837 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107110 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107118 4837 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107126 4837 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107134 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107143 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107151 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107174 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107182 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107190 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107199 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" 
DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107207 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107216 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107225 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107233 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107241 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107249 4837 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107257 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107265 4837 reconciler_common.go:293] 
"Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107273 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107281 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107289 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107297 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107304 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107313 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107321 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on 
node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107329 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107337 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107346 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107355 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107365 4837 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107372 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107381 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 
13:01:22.107389 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107397 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107405 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107414 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107422 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107431 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107439 4837 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107448 4837 reconciler_common.go:293] "Volume detached for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107456 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.107464 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.111924 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.127822 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.207869 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.330577 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.345908 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.355931 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 14 13:01:22 crc kubenswrapper[4837]: W1014 13:01:22.357962 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-906e7b20ad7f63e6a314ece77a4fb56de349d291a21f5f71d5b97d951d33d863 WatchSource:0}: Error finding container 906e7b20ad7f63e6a314ece77a4fb56de349d291a21f5f71d5b97d951d33d863: Status 404 returned error can't find the container with id 906e7b20ad7f63e6a314ece77a4fb56de349d291a21f5f71d5b97d951d33d863 Oct 14 13:01:22 crc kubenswrapper[4837]: W1014 13:01:22.368560 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e1fc6794accddc1c3772a02a7277f241d82fff7e20a107d71dbe2fd52c8917ea WatchSource:0}: Error finding container e1fc6794accddc1c3772a02a7277f241d82fff7e20a107d71dbe2fd52c8917ea: Status 404 returned error can't find the container with id e1fc6794accddc1c3772a02a7277f241d82fff7e20a107d71dbe2fd52c8917ea Oct 14 13:01:22 crc kubenswrapper[4837]: W1014 13:01:22.370878 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f21e0a1b734f4f0b6f4a52cb9ebd7e4152040fdbecdee9c518378118cfebc72b WatchSource:0}: Error finding container f21e0a1b734f4f0b6f4a52cb9ebd7e4152040fdbecdee9c518378118cfebc72b: Status 404 returned error can't find the container with id f21e0a1b734f4f0b6f4a52cb9ebd7e4152040fdbecdee9c518378118cfebc72b Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.611797 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612100 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:01:23.612061628 +0000 UTC m=+21.529061481 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.612251 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.612311 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.612363 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.612404 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612556 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612604 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612623 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612606 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612788 4837 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612630 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612857 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612795 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:23.612774287 +0000 UTC m=+21.529774120 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612916 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:01:23.612898361 +0000 UTC m=+21.529898214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612942 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:23.612930022 +0000 UTC m=+21.529929875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612566 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: E1014 13:01:22.612985 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:23.612973663 +0000 UTC m=+21.529973516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.790641 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.791375 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.792047 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.792692 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.793325 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.793850 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.796056 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.797300 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.798608 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.799948 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.802389 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.803833 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.805758 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.807017 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.810417 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.811343 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.811682 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.815120 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.816752 
4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.823669 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.826328 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.826882 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.828089 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.830709 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.831753 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.834137 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.835129 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.837524 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.839197 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.840274 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.842581 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.843796 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.845056 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.845703 4837 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.845915 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.849715 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.851981 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.853248 4837 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.856971 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.860365 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.861866 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.864798 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.866924 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.868381 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.869201 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.870536 4837 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.871031 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.871424 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.872579 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.873315 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.874542 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 14 13:01:22 crc 
kubenswrapper[4837]: I1014 13:01:22.875540 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.877549 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.878178 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.878819 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.880032 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.880787 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.882011 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.899596 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.950458 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.951451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d"} Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.951517 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b"} Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.951529 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e1fc6794accddc1c3772a02a7277f241d82fff7e20a107d71dbe2fd52c8917ea"} Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.955777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e"} Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.955840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"906e7b20ad7f63e6a314ece77a4fb56de349d291a21f5f71d5b97d951d33d863"} Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.957143 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f21e0a1b734f4f0b6f4a52cb9ebd7e4152040fdbecdee9c518378118cfebc72b"} Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.970521 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 14 13:01:22 crc kubenswrapper[4837]: I1014 13:01:22.988578 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.003987 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.014520 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.029387 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, 
/tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.042784 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.054632 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.066483 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.624256 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.624316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.624349 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.624366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.624387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624454 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-14 13:01:25.624426732 +0000 UTC m=+23.541426545 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624482 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624497 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624507 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624547 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624627 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624549 4837 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:25.624535835 +0000 UTC m=+23.541535648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624654 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:25.624647898 +0000 UTC m=+23.541647711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624665 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:25.624660548 +0000 UTC m=+23.541660351 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624715 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624724 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624734 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.624772 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:25.624755651 +0000 UTC m=+23.541755454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.783792 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.783907 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.783957 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.783997 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:23 crc kubenswrapper[4837]: I1014 13:01:23.784035 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:23 crc kubenswrapper[4837]: E1014 13:01:23.784071 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.027746 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-l7bgt"] Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.028067 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.030629 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.030973 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.031021 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.043299 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.068068 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.084081 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.096861 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.111391 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.122507 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.128490 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ld24\" (UniqueName: \"kubernetes.io/projected/e4ade557-3f1e-4a87-8269-24f33cdafcef-kube-api-access-4ld24\") pod \"node-resolver-l7bgt\" (UID: \"e4ade557-3f1e-4a87-8269-24f33cdafcef\") " pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.128559 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/e4ade557-3f1e-4a87-8269-24f33cdafcef-hosts-file\") pod \"node-resolver-l7bgt\" (UID: \"e4ade557-3f1e-4a87-8269-24f33cdafcef\") " pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.136872 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.158447 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.229129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4ld24\" (UniqueName: \"kubernetes.io/projected/e4ade557-3f1e-4a87-8269-24f33cdafcef-kube-api-access-4ld24\") pod \"node-resolver-l7bgt\" (UID: \"e4ade557-3f1e-4a87-8269-24f33cdafcef\") " pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.229233 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e4ade557-3f1e-4a87-8269-24f33cdafcef-hosts-file\") pod \"node-resolver-l7bgt\" (UID: \"e4ade557-3f1e-4a87-8269-24f33cdafcef\") " pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.229354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e4ade557-3f1e-4a87-8269-24f33cdafcef-hosts-file\") pod \"node-resolver-l7bgt\" (UID: \"e4ade557-3f1e-4a87-8269-24f33cdafcef\") " pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.256979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ld24\" (UniqueName: \"kubernetes.io/projected/e4ade557-3f1e-4a87-8269-24f33cdafcef-kube-api-access-4ld24\") pod \"node-resolver-l7bgt\" (UID: \"e4ade557-3f1e-4a87-8269-24f33cdafcef\") " pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.339413 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l7bgt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.424390 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-r24ng"] Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.425222 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-s6qr4"] Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.425342 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.425827 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-h4ggd"] Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.426388 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.428017 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.429718 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.430025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.430223 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.430410 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.430656 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.430827 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.431044 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.431236 4837 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.431821 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.431823 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.432637 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.433185 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.444832 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.482811 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.497988 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.511592 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.524873 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532617 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-system-cni-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532696 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532718 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-hostroot\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532742 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwb87\" (UniqueName: \"kubernetes.io/projected/01492025-d672-4746-af22-53fa41a3f612-kube-api-access-dwb87\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532778 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/724908de-ffce-4ba4-8695-c9757f3b9b73-cni-binary-copy\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532800 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-cni-bin\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532822 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-socket-dir-parent\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532846 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-k8s-cni-cncf-io\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532869 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/01492025-d672-4746-af22-53fa41a3f612-multus-daemon-config\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532889 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-proxy-tls\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532909 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-cnibin\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.532983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-rootfs\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2nhc\" (UniqueName: \"kubernetes.io/projected/724908de-ffce-4ba4-8695-c9757f3b9b73-kube-api-access-z2nhc\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-os-release\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-os-release\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533215 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-netns\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533253 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-etc-kubernetes\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533310 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-conf-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/724908de-ffce-4ba4-8695-c9757f3b9b73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533358 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-multus-certs\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533381 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-cni-multus\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533412 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01492025-d672-4746-af22-53fa41a3f612-cni-binary-copy\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533430 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-kubelet\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533461 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-cnibin\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533483 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xf45\" (UniqueName: \"kubernetes.io/projected/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-kube-api-access-6xf45\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-system-cni-dir\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" 
Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.533524 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-cni-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.542545 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.563416 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.573865 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.584621 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.598470 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-conf-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-etc-kubernetes\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/724908de-ffce-4ba4-8695-c9757f3b9b73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-cni-multus\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-multus-certs\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-conf-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01492025-d672-4746-af22-53fa41a3f612-cni-binary-copy\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634882 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-kubelet\") pod 
\"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634895 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-etc-kubernetes\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634986 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-cnibin\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634982 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-kubelet\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.634983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-cni-multus\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-cnibin\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 
crc kubenswrapper[4837]: I1014 13:01:24.634990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-multus-certs\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xf45\" (UniqueName: \"kubernetes.io/projected/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-kube-api-access-6xf45\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635135 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-system-cni-dir\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635184 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-cni-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635207 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc 
kubenswrapper[4837]: I1014 13:01:24.635251 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-system-cni-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635287 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/724908de-ffce-4ba4-8695-c9757f3b9b73-cni-binary-copy\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-hostroot\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635340 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwb87\" (UniqueName: \"kubernetes.io/projected/01492025-d672-4746-af22-53fa41a3f612-kube-api-access-dwb87\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635368 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-socket-dir-parent\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-k8s-cni-cncf-io\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-system-cni-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635405 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-cni-bin\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-proxy-tls\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/01492025-d672-4746-af22-53fa41a3f612-multus-daemon-config\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-cnibin\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2nhc\" (UniqueName: \"kubernetes.io/projected/724908de-ffce-4ba4-8695-c9757f3b9b73-kube-api-access-z2nhc\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635490 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-rootfs\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635517 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-os-release\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-netns\") pod \"multus-s6qr4\" 
(UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-cni-dir\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-os-release\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635719 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/724908de-ffce-4ba4-8695-c9757f3b9b73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-os-release\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635761 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01492025-d672-4746-af22-53fa41a3f612-cni-binary-copy\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " 
pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636117 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-hostroot\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/724908de-ffce-4ba4-8695-c9757f3b9b73-cni-binary-copy\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.635424 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-system-cni-dir\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-rootfs\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 
13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636273 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-os-release\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636281 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-netns\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-cnibin\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-run-k8s-cni-cncf-io\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636337 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-multus-socket-dir-parent\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/01492025-d672-4746-af22-53fa41a3f612-host-var-lib-cni-bin\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636550 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/01492025-d672-4746-af22-53fa41a3f612-multus-daemon-config\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.636707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/724908de-ffce-4ba4-8695-c9757f3b9b73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.637600 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.641749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-proxy-tls\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.659637 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xf45\" (UniqueName: \"kubernetes.io/projected/d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3-kube-api-access-6xf45\") pod \"machine-config-daemon-h4ggd\" (UID: \"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\") " pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.664606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwb87\" (UniqueName: \"kubernetes.io/projected/01492025-d672-4746-af22-53fa41a3f612-kube-api-access-dwb87\") pod \"multus-s6qr4\" (UID: \"01492025-d672-4746-af22-53fa41a3f612\") " pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.668487 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.700634 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.730306 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.742191 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.755199 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.764692 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s6qr4" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.766263 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.774225 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.775512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2nhc\" (UniqueName: \"kubernetes.io/projected/724908de-ffce-4ba4-8695-c9757f3b9b73-kube-api-access-z2nhc\") pod \"multus-additional-cni-plugins-r24ng\" (UID: \"724908de-ffce-4ba4-8695-c9757f3b9b73\") " pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.779825 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: W1014 13:01:24.780543 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01492025_d672_4746_af22_53fa41a3f612.slice/crio-32db27140e946224e4299febaa9c262411871a6b91fb5ca90ce444c39791c0a3 WatchSource:0}: Error finding container 32db27140e946224e4299febaa9c262411871a6b91fb5ca90ce444c39791c0a3: Status 404 returned error can't find the container with id 32db27140e946224e4299febaa9c262411871a6b91fb5ca90ce444c39791c0a3 Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.795922 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.800902 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xfw4j"] Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.801868 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.804548 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.806285 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.806355 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.806442 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.806485 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.806512 4837 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.806972 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.809325 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.829670 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.849146 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.858659 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.873992 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.891873 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.910433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.933895 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.942865 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-var-lib-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.942912 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-script-lib\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943001 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943056 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc 
kubenswrapper[4837]: I1014 13:01:24.943077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f670a3c6-520c-45ba-980a-00c63703b02b-ovn-node-metrics-cert\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943098 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-etc-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-log-socket\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943140 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-systemd\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 
14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943200 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-netd\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-env-overrides\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943274 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-systemd-units\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6t64\" (UniqueName: \"kubernetes.io/projected/f670a3c6-520c-45ba-980a-00c63703b02b-kube-api-access-j6t64\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-kubelet\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943395 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-node-log\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-ovn\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943517 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-config\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-bin\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-slash\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.943656 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-netns\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.949662 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.963867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l7bgt" 
event={"ID":"e4ade557-3f1e-4a87-8269-24f33cdafcef","Type":"ContainerStarted","Data":"f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c"} Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.963932 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l7bgt" event={"ID":"e4ade557-3f1e-4a87-8269-24f33cdafcef","Type":"ContainerStarted","Data":"30b2a0d714419e94e6f441a55bf39b6ccfa17d2809ba125aa6f427a4ddcc1f4a"} Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.964024 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.965526 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1"} Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.965564 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052"} Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.965575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"0d4112dcd937b3279f802b0a4c862f777c743e72f4ad4482e8f46dba03f8be34"} Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.967266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerStarted","Data":"ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc"} Oct 14 13:01:24 crc 
kubenswrapper[4837]: I1014 13:01:24.967352 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerStarted","Data":"32db27140e946224e4299febaa9c262411871a6b91fb5ca90ce444c39791c0a3"} Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.978290 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:24 crc kubenswrapper[4837]: I1014 13:01:24.993080 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:24Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.007653 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.021912 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.036827 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, 
/tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-bin\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044646 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-slash\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044693 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-bin\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044762 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-netns\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044779 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-slash\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-netns\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-var-lib-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.044759 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r24ng" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-var-lib-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-script-lib\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045844 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-log-socket\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045861 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045943 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-log-socket\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045775 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-script-lib\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.045877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f670a3c6-520c-45ba-980a-00c63703b02b-ovn-node-metrics-cert\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfw4j\" (UID: 
\"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-etc-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-etc-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-systemd\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-systemd\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 
13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-netd\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046358 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-openvswitch\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046388 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-netd\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047299 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-env-overrides\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.046389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-env-overrides\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047413 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-systemd-units\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047457 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-systemd-units\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6t64\" (UniqueName: \"kubernetes.io/projected/f670a3c6-520c-45ba-980a-00c63703b02b-kube-api-access-j6t64\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047552 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-kubelet\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047567 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-node-log\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-kubelet\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-config\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.047983 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-ovn\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.048026 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-node-log\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.048605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-config\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.048629 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-ovn\") pod \"ovnkube-node-xfw4j\" (UID: 
\"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.049787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f670a3c6-520c-45ba-980a-00c63703b02b-ovn-node-metrics-cert\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.053236 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: W1014 13:01:25.058683 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724908de_ffce_4ba4_8695_c9757f3b9b73.slice/crio-1c0711c62db8edb7d7ff8c1d6a2bfd97e930309596b604c11770c9ca282c4de7 WatchSource:0}: Error finding container 1c0711c62db8edb7d7ff8c1d6a2bfd97e930309596b604c11770c9ca282c4de7: Status 404 returned error can't find the container with id 1c0711c62db8edb7d7ff8c1d6a2bfd97e930309596b604c11770c9ca282c4de7 Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.064919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6t64\" (UniqueName: \"kubernetes.io/projected/f670a3c6-520c-45ba-980a-00c63703b02b-kube-api-access-j6t64\") pod \"ovnkube-node-xfw4j\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.068590 4837 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.097071 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.112853 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.113955 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.130955 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.154778 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.187226 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.219448 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.236275 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.256222 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.437642 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.440714 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.449794 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.461119 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, 
/tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.479500 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.497512 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.516224 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.530703 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.540659 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.556962 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.571821 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.588327 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.609922 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.628237 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.643335 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.654413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:25 crc 
kubenswrapper[4837]: I1014 13:01:25.654541 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.654577 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.654603 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.654629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.654714 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:25 crc 
kubenswrapper[4837]: E1014 13:01:25.654771 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:29.654755319 +0000 UTC m=+27.571755142 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.654916 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.654989 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655060 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655103 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655228 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655283 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655096 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655010 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:01:29.654977446 +0000 UTC m=+27.571977259 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655546 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:29.65551173 +0000 UTC m=+27.572511553 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655573 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:29.655564111 +0000 UTC m=+27.572563934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.655595 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:29.655586942 +0000 UTC m=+27.572586775 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.657381 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.668641 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 
13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.682869 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.699011 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.710446 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.723454 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.735483 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.749050 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.763915 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.777734 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.784180 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.784180 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.784316 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.784428 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.784570 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:25 crc kubenswrapper[4837]: E1014 13:01:25.784751 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.789309 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.801672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.818000 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.971686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7"} Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.973645 4837 generic.go:334] "Generic (PLEG): container finished" podID="724908de-ffce-4ba4-8695-c9757f3b9b73" containerID="71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e" exitCode=0 Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.973797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerDied","Data":"71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e"} Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.973840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerStarted","Data":"1c0711c62db8edb7d7ff8c1d6a2bfd97e930309596b604c11770c9ca282c4de7"} Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.976615 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" exitCode=0 Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.976654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" 
event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.976707 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"6ce098dad1f402fc79fe5ee600058e8d4dab7229ca91c0690f68b99d6b60e4e0"} Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.981877 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb
276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:25 crc kubenswrapper[4837]: I1014 13:01:25.996295 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:25Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.007784 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.025538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.037353 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.048922 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.064124 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.084912 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.104853 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.117639 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.143355 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.159217 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.199551 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.249840 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.282524 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.323549 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.364016 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.401619 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.449196 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.480319 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.516723 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.564272 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.627716 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.658417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.684714 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.724207 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.911114 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 14 13:01:26 crc 
kubenswrapper[4837]: I1014 13:01:26.923389 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.927033 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.928038 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.941237 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, 
/tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.955497 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.968047 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.983421 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.989267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.989313 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.989325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.989335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.989345 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.989355 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" 
event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.991961 4837 generic.go:334] "Generic (PLEG): container finished" podID="724908de-ffce-4ba4-8695-c9757f3b9b73" containerID="1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28" exitCode=0 Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.992023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerDied","Data":"1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28"} Oct 14 13:01:26 crc kubenswrapper[4837]: I1014 13:01:26.997819 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:26Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.018207 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.061090 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.107096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.143061 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.182098 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.223368 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.259755 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.300052 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.344175 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.384582 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.419537 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.462200 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.505558 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.542569 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.581738 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.618264 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.663139 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.703423 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.739451 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.783008 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.784122 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.784289 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:27 crc kubenswrapper[4837]: E1014 13:01:27.784314 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.784395 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:27 crc kubenswrapper[4837]: E1014 13:01:27.784508 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:27 crc kubenswrapper[4837]: E1014 13:01:27.784754 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.826567 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.857872 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-q2xkc"] Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.858367 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.860537 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.871870 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.890535 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.910204 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.941732 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.981201 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-serviceca\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.981470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmwh\" (UniqueName: \"kubernetes.io/projected/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-kube-api-access-2hmwh\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.981584 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-host\") pod 
\"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.985870 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:27Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.997919 4837 generic.go:334] "Generic (PLEG): container finished" podID="724908de-ffce-4ba4-8695-c9757f3b9b73" containerID="654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500" exitCode=0 Oct 14 13:01:27 crc kubenswrapper[4837]: I1014 13:01:27.998047 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerDied","Data":"654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.018783 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.060623 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.082376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-serviceca\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.082423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmwh\" (UniqueName: \"kubernetes.io/projected/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-kube-api-access-2hmwh\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.082441 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-host\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.082536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-host\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.083518 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-serviceca\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 
13:01:28.120435 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.174983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmwh\" (UniqueName: \"kubernetes.io/projected/ba3e1251-eebb-4db2-8db1-1d8c63a7660b-kube-api-access-2hmwh\") pod \"node-ca-q2xkc\" (UID: \"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\") " pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.180847 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.181230 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q2xkc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.200058 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.247787 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.277176 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.296578 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:01:28 crc kubenswrapper[4837]: 
I1014 13:01:28.298220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.298250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.298260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.298339 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.319129 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.371137 4837 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.371929 4837 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.372976 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.373010 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.373019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.373036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.373046 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: E1014 13:01:28.389255 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.393135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.393195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.393205 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.393222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.393232 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.396605 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: E1014 13:01:28.406469 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.409721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.409750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.409759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.409851 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.409861 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: E1014 13:01:28.420635 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.423634 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.423665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.423673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.423688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.423697 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: E1014 13:01:28.435237 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.438755 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.438794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.438805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.438823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.438836 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.446136 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: E1014 13:01:28.449067 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: E1014 13:01:28.449187 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.451277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.451306 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.451315 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.451328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.451337 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.458206 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.479899 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.518453 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.553252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.553325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.553346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.553381 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.553404 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.559385 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.599372 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.638923 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.655514 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.655558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.655569 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc 
kubenswrapper[4837]: I1014 13:01:28.655585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.655597 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.679926 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.724479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.762243 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.763732 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.763797 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.763814 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.763831 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.763845 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.799514 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.841036 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.866193 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.866233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.866281 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.866302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.866313 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.892975 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.926222 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.964357 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:28Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.969573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.969627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.969644 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.969669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:28 crc kubenswrapper[4837]: I1014 13:01:28.969689 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:28Z","lastTransitionTime":"2025-10-14T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.005135 4837 generic.go:334] "Generic (PLEG): container finished" podID="724908de-ffce-4ba4-8695-c9757f3b9b73" containerID="a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067" exitCode=0 Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.005276 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerDied","Data":"a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.009104 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q2xkc" event={"ID":"ba3e1251-eebb-4db2-8db1-1d8c63a7660b","Type":"ContainerStarted","Data":"6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.009207 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q2xkc" event={"ID":"ba3e1251-eebb-4db2-8db1-1d8c63a7660b","Type":"ContainerStarted","Data":"c54ee19afc2fdbd87e57ea22f1a15fe98332aeb95071a574b225ef541bd8b35c"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.011674 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 
13:01:29.048065 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 
13:01:29.074744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.074835 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.074881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.074917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.074941 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.080023 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.131633 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.161404 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.177190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.177243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.177255 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.177272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.177609 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.202091 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d0
1f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.238617 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819ee
db413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.278250 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e
1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.279719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.279784 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.279809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.279840 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.279915 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.327488 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.361916 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.382461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.382496 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.382506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.382522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.382537 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.402106 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.441218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.479561 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.485551 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.485599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc 
kubenswrapper[4837]: I1014 13:01:29.485610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.485627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.485639 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.519802 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.557043 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.594177 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.594211 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.594222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.594236 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.594245 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.602147 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.639370 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.678713 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.697759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.697799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.697809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.697824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.697835 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.698696 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.699093 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:01:37.699070148 +0000 UTC m=+35.616069961 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.699253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.699289 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.699315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.699351 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.699470 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.699484 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.699496 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:29 crc 
kubenswrapper[4837]: E1014 13:01:29.699530 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:37.69952103 +0000 UTC m=+35.616520843 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.699931 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.699969 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:37.699958381 +0000 UTC m=+35.616958194 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.700019 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.700032 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.700042 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.700067 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:37.700058044 +0000 UTC m=+35.617057857 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.700099 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.700123 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:37.700115797 +0000 UTC m=+35.617115610 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.720251 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.769793 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.783973 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.784002 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.784098 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.784204 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.784411 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:29 crc kubenswrapper[4837]: E1014 13:01:29.784622 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.800527 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.800551 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.800559 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.800572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.800581 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.903635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.903668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.903679 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.903694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:29 crc kubenswrapper[4837]: I1014 13:01:29.903705 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:29Z","lastTransitionTime":"2025-10-14T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.006962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.007005 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.007017 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.007040 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.007053 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.013978 4837 generic.go:334] "Generic (PLEG): container finished" podID="724908de-ffce-4ba4-8695-c9757f3b9b73" containerID="da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43" exitCode=0 Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.014043 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerDied","Data":"da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.018340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.037350 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.068270 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.085422 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.101268 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.120837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.120893 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.120909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.120937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.120954 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.129228 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596
e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.149637 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.166969 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.194713 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.219456 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d0
1f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.223394 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.223446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.223467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.223494 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.223512 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.235221 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:
01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.250062 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.268053 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.281078 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.318975 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.325603 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.325666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.325693 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.325735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.325761 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.357942 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.427783 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.427827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.427839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.427858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.427871 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.530919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.530971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.530989 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.531019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.531037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.633873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.633929 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.633946 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.633971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.633988 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.736525 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.736583 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.736604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.736635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.736657 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.839651 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.839709 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.839727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.839754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.839775 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.943495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.943541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.943558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.943581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:30 crc kubenswrapper[4837]: I1014 13:01:30.943598 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:30Z","lastTransitionTime":"2025-10-14T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.027071 4837 generic.go:334] "Generic (PLEG): container finished" podID="724908de-ffce-4ba4-8695-c9757f3b9b73" containerID="9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58" exitCode=0 Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.027130 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerDied","Data":"9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.046719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.046754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.046763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.046776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.046785 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.051558 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.070043 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.087603 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.104207 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.119907 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.135796 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.150580 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.150626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.150649 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.150677 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.150698 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.163524 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596
e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.190769 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.215606 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.233328 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.251023 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.254541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.254596 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.254622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.254653 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.254679 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.267659 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.282929 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.299376 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.316739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:31Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.356952 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.356984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.356993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.357007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.357019 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.460360 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.460420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.460437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.460462 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.460479 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.563314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.563379 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.563397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.563424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.563443 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.666677 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.666787 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.666807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.666836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.666857 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.770088 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.770208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.770236 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.770268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.770294 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.783966 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.784033 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.784048 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:31 crc kubenswrapper[4837]: E1014 13:01:31.784133 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:31 crc kubenswrapper[4837]: E1014 13:01:31.784316 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:31 crc kubenswrapper[4837]: E1014 13:01:31.784399 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.873340 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.873409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.873427 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.873452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.873471 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.977221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.977338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.977359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.977389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:31 crc kubenswrapper[4837]: I1014 13:01:31.977407 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:31Z","lastTransitionTime":"2025-10-14T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.042701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.042977 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.043009 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.048468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" event={"ID":"724908de-ffce-4ba4-8695-c9757f3b9b73","Type":"ContainerStarted","Data":"90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.066994 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.080312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.080367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.080384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.080409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.080426 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.082956 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.083390 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.095093 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserv
er\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS 
handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.114951 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.162152 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.182982 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.183014 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.183024 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.183038 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.183047 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.183360 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.200735 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.212457 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.230234 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.250545 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.262829 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.275862 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.285221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.285260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.285273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.285300 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.285313 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.293555 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.307082 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.317196 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.332129 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.351611 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.369964 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.388575 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.388669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.388688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.388713 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.388731 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.389073 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.407537 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.424755 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.439016 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.456302 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.471063 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.490985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.491050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.491067 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.491090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.491118 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.501818 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.537552 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.558065 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.576835 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.594215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.594346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.594365 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.594392 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.594410 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.600830 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c3
02bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.622654 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.639202 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.697341 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.697410 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.697427 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.697471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.697489 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.801262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.801341 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.801364 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.801393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.801419 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.814087 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.834947 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.854098 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.877954 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.896347 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.904451 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.904487 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.904499 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.904517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.904529 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:32Z","lastTransitionTime":"2025-10-14T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.916077 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596
e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.933523 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.963410 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.980321 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:2
2Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:32 crc kubenswrapper[4837]: I1014 13:01:32.997125 4837 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:32Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.006820 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.006847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.006856 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.006871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.006883 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.011314 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:33Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.021540 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:33Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.041128 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda
4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:33Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.051193 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.057941 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:33Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.074814 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:33Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.109850 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.109906 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.109927 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.109957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.109979 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.212369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.212518 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.212545 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.212576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.212600 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.315604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.315665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.315681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.315706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.315723 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.418533 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.418577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.418590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.418607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.418623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.521826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.521891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.521910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.521933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.521950 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.625366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.625436 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.625459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.625488 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.625511 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.728849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.728907 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.728925 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.728948 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.728967 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.783755 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:33 crc kubenswrapper[4837]: E1014 13:01:33.783886 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.783755 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.783767 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:33 crc kubenswrapper[4837]: E1014 13:01:33.784035 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:33 crc kubenswrapper[4837]: E1014 13:01:33.784099 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.831895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.831947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.831961 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.831979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.831993 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.934703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.934766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.934786 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.934809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:33 crc kubenswrapper[4837]: I1014 13:01:33.934828 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:33Z","lastTransitionTime":"2025-10-14T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.037290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.037369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.037394 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.037438 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.037491 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.054389 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.140924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.140989 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.141007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.141033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.141055 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.244459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.244520 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.244537 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.244562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.244580 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.347427 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.347466 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.347475 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.347490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.347499 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.449760 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.449822 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.449839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.449863 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.449880 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.551985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.552024 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.552033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.552080 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.552102 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.654129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.654434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.654546 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.654698 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.654805 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.757667 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.758058 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.758299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.758464 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.758626 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.861889 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.861956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.861984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.862013 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.862038 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.965148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.965230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.965247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.965279 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:34 crc kubenswrapper[4837]: I1014 13:01:34.965296 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:34Z","lastTransitionTime":"2025-10-14T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.060842 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/0.log" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.064873 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411" exitCode=1 Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.064930 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.066095 4837 scope.go:117] "RemoveContainer" containerID="0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.067402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.067452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.067470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.067495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.067512 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.101302 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb89663653
71eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.125739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.145590 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.169726 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.169763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.169773 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.169788 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.169800 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.172398 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c3
02bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.185490 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.196758 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.214533 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.229345 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.242743 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.255880 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.269086 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.272346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.272394 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.272426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.272445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.272458 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.281079 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.304474 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.325622 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.346271 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:34Z\\\",\\\"message\\\":\\\"I1014 13:01:34.471301 6133 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1014 13:01:34.471347 6133 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed 
*v1.Pod event handler 3\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1014 13:01:34.471399 6133 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1014 13:01:34.471589 6133 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1014 13:01:34.471644 6133 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1014 13:01:34.471659 6133 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1014 13:01:34.471744 6133 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1014 13:01:34.471776 6133 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1014 13:01:34.471739 6133 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1014 13:01:34.471805 6133 handler.go:208] Removed *v1.Node event handler 2\\\\nI1014 13:01:34.471835 6133 factory.go:656] Stopping watch factory\\\\nI1014 13:01:34.471855 6133 ovnkube.go:599] Stopped ovnkube\\\\nI1014 13:01:34.471856 6133 handler.go:208] Removed *v1.Node event handler 7\\\\nI1014 13:01:34.471862 6133 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1014 
13:01:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc7
9aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:35Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.374497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.374550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.374567 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.374591 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.374609 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.477640 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.477706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.477723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.477748 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.477768 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.580704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.580760 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.580779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.580804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.580820 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.683309 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.683371 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.683388 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.683411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.683426 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.783442 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.783491 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.783457 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:35 crc kubenswrapper[4837]: E1014 13:01:35.783562 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:35 crc kubenswrapper[4837]: E1014 13:01:35.783680 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:35 crc kubenswrapper[4837]: E1014 13:01:35.783777 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.785021 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.785055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.785070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.785085 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.785096 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.887419 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.887468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.887483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.887503 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.887518 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.989515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.989558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.989573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.989594 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:35 crc kubenswrapper[4837]: I1014 13:01:35.989607 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:35Z","lastTransitionTime":"2025-10-14T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.071265 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/0.log" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.075230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.075480 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.092678 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.092786 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.092807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.093339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.093418 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.101994 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.123654 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.142842 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.159884 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.171911 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.183397 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.196122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.196187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.196200 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.196215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.196578 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.199087 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.213678 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.241447 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:34Z\\\",\\\"message\\\":\\\"I1014 13:01:34.471301 6133 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1014 13:01:34.471347 6133 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1014 13:01:34.471399 6133 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1014 13:01:34.471589 6133 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1014 13:01:34.471644 6133 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1014 13:01:34.471659 6133 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1014 13:01:34.471744 6133 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1014 13:01:34.471776 6133 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1014 13:01:34.471739 6133 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1014 13:01:34.471805 6133 handler.go:208] Removed *v1.Node event handler 2\\\\nI1014 13:01:34.471835 6133 factory.go:656] Stopping watch factory\\\\nI1014 13:01:34.471855 6133 ovnkube.go:599] Stopped ovnkube\\\\nI1014 13:01:34.471856 6133 handler.go:208] Removed *v1.Node event handler 7\\\\nI1014 13:01:34.471862 6133 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1014 
13:01:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.262832 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.277687 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.298870 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.303832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.303906 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.303968 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.303997 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.304018 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.322458 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.336019 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.359358 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda
4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.406811 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.407377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.407507 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.407623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.407733 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.511547 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.511606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.511622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.511649 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.511666 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.614558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.614636 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.614662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.614724 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.614752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.717884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.717950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.717968 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.717993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.718010 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.821512 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.821600 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.821619 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.821653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.821677 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.924573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.924630 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.924642 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.924666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:36 crc kubenswrapper[4837]: I1014 13:01:36.924679 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:36Z","lastTransitionTime":"2025-10-14T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.028487 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.028562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.028579 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.028604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.028628 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.065626 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk"] Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.066555 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.068949 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.070851 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.081727 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/1.log" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.082609 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/0.log" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.086725 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6" exitCode=1 Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.086813 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.086964 4837 scope.go:117] "RemoveContainer" containerID="0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.087929 4837 scope.go:117] "RemoveContainer" containerID="cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.088046 4837 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.088185 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.106452 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.113376 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5404624d-032f-4f37-a72e-101c5d301082-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.113500 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5404624d-032f-4f37-a72e-101c5d301082-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.113550 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5404624d-032f-4f37-a72e-101c5d301082-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.113588 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lzj\" (UniqueName: \"kubernetes.io/projected/5404624d-032f-4f37-a72e-101c5d301082-kube-api-access-k9lzj\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: 
\"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.124051 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.131510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.131588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.131616 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.131654 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.131679 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.139377 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.153072 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.170307 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.184549 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.200583 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.214638 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5404624d-032f-4f37-a72e-101c5d301082-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.214789 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5404624d-032f-4f37-a72e-101c5d301082-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.214847 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lzj\" (UniqueName: \"kubernetes.io/projected/5404624d-032f-4f37-a72e-101c5d301082-kube-api-access-k9lzj\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.215555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5404624d-032f-4f37-a72e-101c5d301082-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.215933 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5404624d-032f-4f37-a72e-101c5d301082-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.216639 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.216783 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5404624d-032f-4f37-a72e-101c5d301082-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.224703 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5404624d-032f-4f37-a72e-101c5d301082-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.233974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.234020 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.234037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.234062 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.234079 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.237144 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lzj\" (UniqueName: \"kubernetes.io/projected/5404624d-032f-4f37-a72e-101c5d301082-kube-api-access-k9lzj\") pod \"ovnkube-control-plane-749d76644c-bgqhk\" (UID: \"5404624d-032f-4f37-a72e-101c5d301082\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.242281 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:34Z\\\",\\\"message\\\":\\\"I1014 13:01:34.471301 6133 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1014 13:01:34.471347 6133 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1014 
13:01:34.471399 6133 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1014 13:01:34.471589 6133 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1014 13:01:34.471644 6133 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1014 13:01:34.471659 6133 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1014 13:01:34.471744 6133 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1014 13:01:34.471776 6133 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1014 13:01:34.471739 6133 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1014 13:01:34.471805 6133 handler.go:208] Removed *v1.Node event handler 2\\\\nI1014 13:01:34.471835 6133 factory.go:656] Stopping watch factory\\\\nI1014 13:01:34.471855 6133 ovnkube.go:599] Stopped ovnkube\\\\nI1014 13:01:34.471856 6133 handler.go:208] Removed *v1.Node event handler 7\\\\nI1014 13:01:34.471862 6133 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1014 
13:01:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.260943 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.281393 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.296882 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.334723 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.336671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.336723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.336737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.336757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.336771 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.352743 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.372816 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.385843 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.395800 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.417563 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.437811 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 
13:01:37.442675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.442702 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.442712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.442725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.442734 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.458992 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596
e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.477909 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.497228 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.512642 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.525984 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.540816 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.546810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.546862 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.546884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.546911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.546932 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.571398 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:34Z\\\",\\\"message\\\":\\\"I1014 13:01:34.471301 6133 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1014 13:01:34.471347 6133 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1014 
13:01:34.471399 6133 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1014 13:01:34.471589 6133 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1014 13:01:34.471644 6133 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1014 13:01:34.471659 6133 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1014 13:01:34.471744 6133 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1014 13:01:34.471776 6133 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1014 13:01:34.471739 6133 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1014 13:01:34.471805 6133 handler.go:208] Removed *v1.Node event handler 2\\\\nI1014 13:01:34.471835 6133 factory.go:656] Stopping watch factory\\\\nI1014 13:01:34.471855 6133 ovnkube.go:599] Stopped ovnkube\\\\nI1014 13:01:34.471856 6133 handler.go:208] Removed *v1.Node event handler 7\\\\nI1014 13:01:34.471862 6133 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1014 13:01:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPa
th\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.593286 4837 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270
de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2
ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.605787 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab
8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.626521 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.645874 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.649821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.649849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.649858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.649873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.649882 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.659642 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.673055 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:37Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.721476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.721634 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:01:53.721604069 +0000 UTC m=+51.638603892 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.721739 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.721791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.721820 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.721858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.721948 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722099 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722189 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:53.722144463 +0000 UTC m=+51.639144286 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722102 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722224 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722278 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:53.722266607 +0000 UTC m=+51.639266430 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722060 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722331 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:53.722318998 +0000 UTC m=+51.639318911 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722004 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722358 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722371 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.722413 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:53.72240381 +0000 UTC m=+51.639403733 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.751669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.751728 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.751744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.751768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.751787 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.784063 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.784124 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.784203 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.784063 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.784331 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:37 crc kubenswrapper[4837]: E1014 13:01:37.784407 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.854192 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.854251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.854269 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.854292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.854309 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.956878 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.956928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.956946 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.956966 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:37 crc kubenswrapper[4837]: I1014 13:01:37.956978 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:37Z","lastTransitionTime":"2025-10-14T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.059815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.059877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.059898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.059938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.059960 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.095245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" event={"ID":"5404624d-032f-4f37-a72e-101c5d301082","Type":"ContainerStarted","Data":"88fdb331e71698668175ac751c88fd1e9370014a960854cdc6ccdf487d8a5c3e"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.095342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" event={"ID":"5404624d-032f-4f37-a72e-101c5d301082","Type":"ContainerStarted","Data":"dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.095372 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" event={"ID":"5404624d-032f-4f37-a72e-101c5d301082","Type":"ContainerStarted","Data":"e0d252499ca047e6048ca28386a445a4e9b976a909dd18c75f8af00436ed68e2"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.101410 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/1.log" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.116999 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.136901 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.162406 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.167037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.167105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.167139 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.167220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.167242 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.185008 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.205913 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.225300 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.243998 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.270277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.270323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.270335 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.270353 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.270366 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.283926 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0637262f55403b5a1b8ea2ff1a1247b927bb2e211693cbbef02acd9e4a493411\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:34Z\\\",\\\"message\\\":\\\"I1014 13:01:34.471301 6133 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1014 13:01:34.471347 6133 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1014 13:01:34.471367 6133 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1014 
13:01:34.471399 6133 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1014 13:01:34.471589 6133 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1014 13:01:34.471644 6133 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1014 13:01:34.471659 6133 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1014 13:01:34.471744 6133 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1014 13:01:34.471776 6133 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1014 13:01:34.471739 6133 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1014 13:01:34.471805 6133 handler.go:208] Removed *v1.Node event handler 2\\\\nI1014 13:01:34.471835 6133 factory.go:656] Stopping watch factory\\\\nI1014 13:01:34.471855 6133 ovnkube.go:599] Stopped ovnkube\\\\nI1014 13:01:34.471856 6133 handler.go:208] Removed *v1.Node event handler 7\\\\nI1014 13:01:34.471862 6133 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1014 13:01:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPa
th\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.318509 4837 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270
de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2
ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.337395 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab
8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.355577 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.369506 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.370391 4837 scope.go:117] "RemoveContainer" containerID="cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.370594 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.372936 4837 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.372969 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.372986 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.373036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.373079 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.380942 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.400910 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.416878 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.436510 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.454449 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e
71698668175ac751c88fd1e9370014a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.471546 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.476426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.476468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.476484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.476508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.476525 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.492713 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.512696 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.535657 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.553760 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.573920 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.579231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.579352 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.579373 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.579402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.579419 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.595410 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.597598 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pcpcf"] Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.598717 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.599014 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.632648 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.650729 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.678739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13
:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.681719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.681994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.682227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.682414 4837 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.682574 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.696929 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.696959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.696970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.696984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.696994 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.698020 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.717607 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.724309 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.726528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.726595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.726612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.726642 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.726660 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.733392 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.733467 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8v2l\" (UniqueName: \"kubernetes.io/projected/7c934a24-9e12-46eb-851e-1a6925dc8909-kube-api-access-r8v2l\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.746202 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.748576 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.752070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.752099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.752111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.752127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.752140 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.766502 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.774331 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.778260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.778316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.778335 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.778361 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.778380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.784395 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.794423 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.799647 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.799709 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.799730 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.799758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.799779 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.803438 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e9370014a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.817495 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.817733 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.819772 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.819814 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.819831 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.819858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.819875 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.821375 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.834696 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.834742 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8v2l\" (UniqueName: \"kubernetes.io/projected/7c934a24-9e12-46eb-851e-1a6925dc8909-kube-api-access-r8v2l\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 13:01:38.834871 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:38 crc kubenswrapper[4837]: E1014 
13:01:38.834968 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:39.334937052 +0000 UTC m=+37.251936905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.842513 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.855424 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.855733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8v2l\" (UniqueName: \"kubernetes.io/projected/7c934a24-9e12-46eb-851e-1a6925dc8909-kube-api-access-r8v2l\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.872971 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.886459 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.903479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.922751 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.922801 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.922815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.922835 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.922848 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:38Z","lastTransitionTime":"2025-10-14T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.923708 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.946786 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.966041 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:38 crc kubenswrapper[4837]: I1014 13:01:38.989088 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:38Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.006829 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.024149 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.025964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.026075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.026110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.026143 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.026203 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.063143 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb89663653
71eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.085076 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.105756 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.124415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.130139 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.130413 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.130577 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.130708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.130826 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.142991 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:39Z is after 2025-08-24T17:21:41Z" Oct 
14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.234510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.234576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.234596 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.234621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.234638 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.338110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.338207 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.338224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.338251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.338271 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.341766 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:39 crc kubenswrapper[4837]: E1014 13:01:39.341957 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:39 crc kubenswrapper[4837]: E1014 13:01:39.342049 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:40.342025948 +0000 UTC m=+38.259025801 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.441369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.441434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.441459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.441489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.441511 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.543940 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.543989 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.544000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.544016 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.544027 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.646920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.646964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.646978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.646994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.647008 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.749577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.749631 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.749647 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.749670 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.749701 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.784233 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.784321 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.784255 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:39 crc kubenswrapper[4837]: E1014 13:01:39.784423 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:39 crc kubenswrapper[4837]: E1014 13:01:39.784598 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:39 crc kubenswrapper[4837]: E1014 13:01:39.784684 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.853396 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.853446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.853466 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.853491 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.853509 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.956435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.956755 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.956936 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.957126 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:39 crc kubenswrapper[4837]: I1014 13:01:39.957380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:39Z","lastTransitionTime":"2025-10-14T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.060788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.060824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.060833 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.060846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.060855 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.164102 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.164202 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.164227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.164255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.164280 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.268128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.268232 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.268272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.268309 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.268335 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.351994 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:40 crc kubenswrapper[4837]: E1014 13:01:40.352270 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:40 crc kubenswrapper[4837]: E1014 13:01:40.352361 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:42.352338677 +0000 UTC m=+40.269338520 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.371691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.371785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.371803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.371897 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.371949 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.474939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.475049 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.475074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.475102 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.475121 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.577871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.577938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.577955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.577979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.577996 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.681772 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.681847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.681869 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.681899 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.681921 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.783563 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:40 crc kubenswrapper[4837]: E1014 13:01:40.783794 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.785913 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.786229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.786405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.786623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.786815 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.889727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.889776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.889792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.889816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.889833 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.993284 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.993355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.993372 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.993395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:40 crc kubenswrapper[4837]: I1014 13:01:40.993412 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:40Z","lastTransitionTime":"2025-10-14T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.097107 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.097214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.097240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.097271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.097295 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.200572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.200644 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.200663 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.200688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.200708 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.303550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.303617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.303636 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.303660 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.303677 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.407100 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.407206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.407233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.407262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.407285 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.510275 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.510352 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.510376 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.510405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.510433 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.613807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.613863 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.613882 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.613907 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.613925 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.717245 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.717317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.717333 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.717358 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.717376 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.783826 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.783894 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.783847 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:41 crc kubenswrapper[4837]: E1014 13:01:41.784076 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:41 crc kubenswrapper[4837]: E1014 13:01:41.784255 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:41 crc kubenswrapper[4837]: E1014 13:01:41.784362 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.819886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.819988 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.820008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.820033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.820058 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.922596 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.922691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.922710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.922734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:41 crc kubenswrapper[4837]: I1014 13:01:41.922752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:41Z","lastTransitionTime":"2025-10-14T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.025111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.025149 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.025187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.025215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.025224 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.127020 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.127082 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.127096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.127125 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.127134 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.230200 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.230517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.230609 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.230694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.230783 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.334140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.334259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.334289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.334319 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.334341 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.399229 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:42 crc kubenswrapper[4837]: E1014 13:01:42.399389 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:42 crc kubenswrapper[4837]: E1014 13:01:42.399485 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:46.399461494 +0000 UTC m=+44.316461337 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.437312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.437387 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.437405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.437431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.437447 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.540219 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.540286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.540323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.540350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.540367 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.642957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.643272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.643372 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.643637 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.643712 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.746292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.746373 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.746393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.746418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.746437 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.783614 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:42 crc kubenswrapper[4837]: E1014 13:01:42.783822 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.803122 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.814016 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.844931 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.848578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.848697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.848720 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.848745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.848763 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.869195 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.886692 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.905732 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.921143 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.937137 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.950433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc 
kubenswrapper[4837]: I1014 13:01:42.951548 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.951610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.951635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.951664 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.951688 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:42Z","lastTransitionTime":"2025-10-14T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.966726 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:42 crc kubenswrapper[4837]: I1014 13:01:42.982397 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:42Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.004678 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:43Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.023741 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:43Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.046078 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:43Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.054656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.054774 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.054799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.054829 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.054851 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.063322 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:43Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.081345 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:43Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.107063 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:43Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.158020 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.158091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.158103 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.158121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.158132 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.260881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.261298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.261444 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.261579 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.261697 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.364679 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.364738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.364756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.364779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.364798 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.467933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.467994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.468010 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.468034 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.468051 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.570758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.570818 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.570837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.570861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.570880 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.674602 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.674661 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.674680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.674718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.674757 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.778340 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.778421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.778437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.778462 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.778479 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.783658 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.783723 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:43 crc kubenswrapper[4837]: E1014 13:01:43.783791 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.783664 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:43 crc kubenswrapper[4837]: E1014 13:01:43.783961 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:43 crc kubenswrapper[4837]: E1014 13:01:43.784083 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.881783 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.881848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.881866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.881918 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.881937 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.984247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.984293 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.984310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.984330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:43 crc kubenswrapper[4837]: I1014 13:01:43.984341 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:43Z","lastTransitionTime":"2025-10-14T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.087239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.087273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.087282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.087296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.087306 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.190033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.190095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.190111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.190136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.190189 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.293747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.293865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.293888 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.293917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.293940 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.397103 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.397209 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.397226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.397254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.397271 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.499652 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.500197 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.500280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.500349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.500416 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.602984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.603290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.603376 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.603465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.603544 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.707135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.707223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.707249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.707275 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.707294 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.784312 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:44 crc kubenswrapper[4837]: E1014 13:01:44.784866 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.809600 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.809665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.809678 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.809699 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.809714 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.912595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.912679 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.912697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.912726 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:44 crc kubenswrapper[4837]: I1014 13:01:44.912745 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:44Z","lastTransitionTime":"2025-10-14T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.016108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.016199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.016217 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.016242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.016260 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.119187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.119264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.119277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.119306 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.119321 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.223221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.223305 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.223472 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.223510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.223530 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.327060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.327191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.327216 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.327246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.327267 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.430972 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.431025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.431037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.431056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.431067 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.534789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.534872 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.534899 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.534933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.534956 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.638226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.638299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.638318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.638352 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.638376 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.741954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.742026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.742046 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.742070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.742115 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.783869 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.783909 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.783870 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:45 crc kubenswrapper[4837]: E1014 13:01:45.784441 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:45 crc kubenswrapper[4837]: E1014 13:01:45.784658 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:45 crc kubenswrapper[4837]: E1014 13:01:45.784775 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.845251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.845328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.845339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.845357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.845368 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.948149 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.948250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.948268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.948291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:45 crc kubenswrapper[4837]: I1014 13:01:45.948311 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:45Z","lastTransitionTime":"2025-10-14T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.051905 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.051949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.051961 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.051981 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.051995 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.154252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.154299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.154313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.154336 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.154353 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.257885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.257934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.257950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.257969 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.257981 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.360408 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.360469 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.360488 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.360512 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.360530 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.449082 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:46 crc kubenswrapper[4837]: E1014 13:01:46.449336 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:46 crc kubenswrapper[4837]: E1014 13:01:46.449458 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:01:54.449424673 +0000 UTC m=+52.366424526 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.463847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.463906 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.463923 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.463947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.463966 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.566807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.566877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.566899 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.566933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.566954 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.669594 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.669662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.669685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.669716 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.669738 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.772873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.772924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.772941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.772964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.772981 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.784111 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:46 crc kubenswrapper[4837]: E1014 13:01:46.784358 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.876908 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.877012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.877105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.877141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.877221 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.979769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.979828 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.979848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.979871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:46 crc kubenswrapper[4837]: I1014 13:01:46.979889 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:46Z","lastTransitionTime":"2025-10-14T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.082974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.083046 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.083064 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.083088 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.083106 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.185328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.185393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.185409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.185432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.185448 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.288783 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.288853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.288870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.288901 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.288921 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.392023 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.392083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.392099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.392124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.392143 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.495694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.495766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.495794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.495827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.495848 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.599984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.600934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.601091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.601280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.601432 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.704300 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.704644 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.704796 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.704944 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.705073 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.784401 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.784441 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.784831 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:47 crc kubenswrapper[4837]: E1014 13:01:47.785053 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:47 crc kubenswrapper[4837]: E1014 13:01:47.785383 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:47 crc kubenswrapper[4837]: E1014 13:01:47.785567 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.809007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.809105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.809129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.809232 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.809257 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.912325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.912389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.912407 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.912459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:47 crc kubenswrapper[4837]: I1014 13:01:47.912478 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:47Z","lastTransitionTime":"2025-10-14T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.015679 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.015754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.015775 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.015805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.015828 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.119498 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.119559 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.119578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.119601 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.119617 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.223041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.223118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.223133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.223154 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.223209 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.349292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.349348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.349367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.349401 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.349427 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.452270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.452338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.452358 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.452384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.452403 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.554778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.555380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.555600 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.555829 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.556034 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.659253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.659336 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.659360 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.659389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.659409 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.762337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.762402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.762420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.762443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.762459 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.784625 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:48 crc kubenswrapper[4837]: E1014 13:01:48.784840 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.849940 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.849988 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.850004 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.850018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.850028 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: E1014 13:01:48.870706 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.875611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.875658 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.875670 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.875689 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.875704 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: E1014 13:01:48.896299 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.901490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.901543 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.901565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.901588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.901606 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.927305 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.927377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.927402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.927432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.927454 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} 
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.954980 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.955033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.955055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.955084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.955106 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:48 crc kubenswrapper[4837]: E1014 13:01:48.971029 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:48Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:48 crc kubenswrapper[4837]: E1014 13:01:48.971316 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.973214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.973298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.973322 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.973357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:48 crc kubenswrapper[4837]: I1014 13:01:48.973381 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:48Z","lastTransitionTime":"2025-10-14T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.076929 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.076994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.077012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.077040 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.077059 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.180606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.180687 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.180711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.180742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.180764 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.291637 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.291689 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.291707 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.291732 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.291749 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.395229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.395629 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.395786 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.395902 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.396010 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.518880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.518935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.518959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.518991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.519015 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.622140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.622262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.622296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.622331 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.622355 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.725291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.725351 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.725373 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.725398 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.725416 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.783525 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.783603 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.783688 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:49 crc kubenswrapper[4837]: E1014 13:01:49.783877 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:49 crc kubenswrapper[4837]: E1014 13:01:49.784016 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:49 crc kubenswrapper[4837]: E1014 13:01:49.784200 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.829895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.829988 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.830018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.830051 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.830073 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.933011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.933055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.933066 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.933083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:49 crc kubenswrapper[4837]: I1014 13:01:49.933095 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:49Z","lastTransitionTime":"2025-10-14T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.036095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.036238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.036274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.036304 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.036328 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.138041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.138095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.138105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.138120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.138132 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.240248 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.240307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.240326 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.240349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.240366 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.342411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.342475 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.342497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.342526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.342551 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.444816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.444886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.444905 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.444929 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.444947 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.548181 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.548258 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.548272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.548288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.548300 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.650680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.650819 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.650839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.650867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.650886 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.754182 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.754257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.754274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.754298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.754314 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.783864 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:50 crc kubenswrapper[4837]: E1014 13:01:50.784100 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.785425 4837 scope.go:117] "RemoveContainer" containerID="cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.859508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.859572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.859592 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.859619 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.859637 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.963242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.963363 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.963387 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.963414 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:50 crc kubenswrapper[4837]: I1014 13:01:50.963432 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:50Z","lastTransitionTime":"2025-10-14T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.066106 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.066141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.066152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.066195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.066209 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.152237 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/1.log" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.156402 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.156993 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.176070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.176117 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.176128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.176145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.176170 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.203218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.224559 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.252103 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda
4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.268339 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.278081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.278118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.278129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.278146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.278209 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.289243 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.317285 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.332632 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.355423 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.366818 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc 
kubenswrapper[4837]: I1014 13:01:51.377422 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.379845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.379886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.379899 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.379917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.379928 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.386121 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.395611 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.411392 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.425113 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.437823 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.451234 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.470585 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.482083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.482122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.482131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.482146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.482169 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.584977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.585027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.585037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.585050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.585059 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.687505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.687627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.687650 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.687684 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.687708 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.784114 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.784252 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:51 crc kubenswrapper[4837]: E1014 13:01:51.784338 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.784271 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:51 crc kubenswrapper[4837]: E1014 13:01:51.784451 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:51 crc kubenswrapper[4837]: E1014 13:01:51.784600 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.790880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.790920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.790936 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.790959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.790976 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.893589 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.893637 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.893648 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.893665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.893679 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.996648 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.996706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.996723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.996745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:51 crc kubenswrapper[4837]: I1014 13:01:51.996762 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:51Z","lastTransitionTime":"2025-10-14T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.099125 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.099250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.099277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.099303 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.099321 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.135433 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.148544 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.160227 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.162656 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/2.log" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.163662 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/1.log" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.167574 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c" exitCode=1 Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.167628 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.167693 4837 scope.go:117] "RemoveContainer" containerID="cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.168980 4837 
scope.go:117] "RemoveContainer" containerID="b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c" Oct 14 13:01:52 crc kubenswrapper[4837]: E1014 13:01:52.169452 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.181721 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.199116 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.201763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.202050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.202288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 
13:01:52.202470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.202639 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.217529 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.233960 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.252684 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.271984 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.302915 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.305597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.305643 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.305661 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.305693 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.305715 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.337257 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.356551 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.375824 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.398961 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.409426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.409497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.409517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.409542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.409559 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.417950 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.433555 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.450082 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568
586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e9370014a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.468596 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc 
kubenswrapper[4837]: I1014 13:01:52.485744 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.500656 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.512239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.512322 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.512346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.512381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.512406 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.517514 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.536055 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.556295 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.573050 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.587068 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.603498 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.615685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.615738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.615756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.615781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.615799 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.632814 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 
13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.646347 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.662122 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.693983 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.711858 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.719018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.719085 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.719112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.719145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.719213 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.732690 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.755142 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.775146 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.783620 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:52 crc kubenswrapper[4837]: E1014 13:01:52.783803 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.794452 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.811576 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.821882 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.821912 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.821923 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.821939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.821952 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.827759 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 
14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.843864 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.859192 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.879408 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.896591 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.911926 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.925235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.925279 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.925295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.925317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.925330 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:52Z","lastTransitionTime":"2025-10-14T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.932391 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.951295 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.973358 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc7dcfba13bb947c5043f6fc4d3df820ed859dd853e5bad9544c6091930f30d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:36Z\\\",\\\"message\\\":\\\"ault_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1014 13:01:36.119866 6274 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:36Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:36.119835 6274 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:52 crc kubenswrapper[4837]: I1014 13:01:52.997882 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:52Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.017490 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.027990 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.028260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.028374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.028468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.028576 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.036388 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.064015 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.082141 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.097108 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.116637 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.130371 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.130811 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.130852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.130864 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.130880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.130893 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.143654 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 
14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.155296 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.172007 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/2.log" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.176108 4837 scope.go:117] "RemoveContainer" containerID="b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c" Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.176446 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.196375 4837 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b
2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4
14176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.213293 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.225388 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.233501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.233552 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.233570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.233594 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.233612 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.241034 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.251613 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.264135 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc
43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.278002 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.305379 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.334743 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.337252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.337687 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.337848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.338022 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.338226 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.354739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.374572 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.398701 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.416837 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.433852 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.441926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.442218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.442417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.442572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.442728 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.451750 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.466732 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.476336 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc 
kubenswrapper[4837]: I1014 13:01:53.486568 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:53Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.545229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.545571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.545714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.545910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.546098 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.648542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.648820 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.649014 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.649225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.649443 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.752654 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.752705 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.752721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.752744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.752762 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.758247 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.758346 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.758394 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.758431 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.758469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758579 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:02:25.758558193 +0000 UTC m=+83.675558046 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758597 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758630 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758650 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 
13:01:53.758707 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758751 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758794 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758817 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758830 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758719 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:02:25.758701767 +0000 UTC m=+83.675701610 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758902 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:02:25.758881262 +0000 UTC m=+83.675881255 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758932 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:02:25.758917333 +0000 UTC m=+83.675917196 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.758960 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:02:25.758944723 +0000 UTC m=+83.675944566 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.784554 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.784677 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.784674 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.784807 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.784919 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:53 crc kubenswrapper[4837]: E1014 13:01:53.785076 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.855700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.855825 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.855846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.855870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.855887 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.959247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.959314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.959337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.959367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:53 crc kubenswrapper[4837]: I1014 13:01:53.959387 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:53Z","lastTransitionTime":"2025-10-14T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.062825 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.062894 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.062913 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.062942 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.062960 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.165553 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.165617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.165634 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.165659 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.165678 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.268830 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.268882 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.268897 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.268918 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.268930 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.371637 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.371704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.371721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.371745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.371763 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.465927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:54 crc kubenswrapper[4837]: E1014 13:01:54.466149 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:54 crc kubenswrapper[4837]: E1014 13:01:54.466317 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:02:10.466282292 +0000 UTC m=+68.383282155 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.474485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.474541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.474560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.474582 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.474635 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.577015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.577061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.577071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.577084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.577092 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.679131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.679221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.679238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.679262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.679278 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.781730 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.781769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.781781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.781875 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.781914 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.784339 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:54 crc kubenswrapper[4837]: E1014 13:01:54.784552 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.884057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.884119 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.884137 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.884185 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.884207 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.987897 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.988267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.988279 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.988298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:54 crc kubenswrapper[4837]: I1014 13:01:54.988310 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:54Z","lastTransitionTime":"2025-10-14T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.091701 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.091769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.091785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.091809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.091826 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.194855 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.194890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.194899 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.194913 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.194923 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.298832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.298931 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.298947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.298973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.298991 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.402340 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.402425 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.402450 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.402486 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.402513 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.505615 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.505665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.505681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.505703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.505720 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.611666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.611709 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.611721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.611737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.611747 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.714923 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.714982 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.714999 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.715024 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.715040 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.783384 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.783485 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.783554 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:55 crc kubenswrapper[4837]: E1014 13:01:55.783549 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:55 crc kubenswrapper[4837]: E1014 13:01:55.783691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:55 crc kubenswrapper[4837]: E1014 13:01:55.783806 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.818311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.818390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.818415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.818448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.818469 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.922058 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.922146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.922208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.922241 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:55 crc kubenswrapper[4837]: I1014 13:01:55.922267 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:55Z","lastTransitionTime":"2025-10-14T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.026316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.026389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.026411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.026446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.026468 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.129759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.129852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.129871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.129894 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.129911 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.232547 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.232615 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.232633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.232659 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.232677 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.336572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.336635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.336652 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.336675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.336692 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.441522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.441632 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.441680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.441708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.441725 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.544661 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.545114 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.545534 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.545656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.545970 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.649747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.649788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.649798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.649813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.649823 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.752202 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.752251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.752263 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.752279 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.752292 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.783813 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:56 crc kubenswrapper[4837]: E1014 13:01:56.783961 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.854332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.854370 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.854395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.854411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.854421 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.956739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.956800 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.956817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.956841 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:56 crc kubenswrapper[4837]: I1014 13:01:56.956857 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:56Z","lastTransitionTime":"2025-10-14T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.059832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.059964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.059986 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.060011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.060032 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.162745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.162800 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.162816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.162839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.162855 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.266068 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.266129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.266151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.266230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.266256 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.369412 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.369467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.369483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.369537 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.369557 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.472714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.472777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.472794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.472817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.472875 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.575518 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.575604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.575635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.575663 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.575685 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.678489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.678603 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.678628 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.678866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.678887 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.782398 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.782465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.782487 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.782517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.782538 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.783599 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.783654 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:57 crc kubenswrapper[4837]: E1014 13:01:57.783741 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.783606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:57 crc kubenswrapper[4837]: E1014 13:01:57.783877 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:57 crc kubenswrapper[4837]: E1014 13:01:57.784020 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.885294 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.885353 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.885364 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.885382 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.885396 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.988082 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.988112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.988120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.988132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:57 crc kubenswrapper[4837]: I1014 13:01:57.988140 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:57Z","lastTransitionTime":"2025-10-14T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.091111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.091205 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.091223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.091246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.091262 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.194314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.194367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.194379 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.194396 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.194408 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.302975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.303026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.303047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.303074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.303094 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.406299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.406364 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.406383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.406413 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.406431 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.509189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.509262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.509289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.509323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.509347 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.612586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.612759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.612788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.612869 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.612895 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.715899 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.715953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.715969 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.716186 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.716204 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.784128 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:01:58 crc kubenswrapper[4837]: E1014 13:01:58.784318 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.819061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.819121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.819256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.819366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.819457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.921962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.922078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.922104 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.922132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:58 crc kubenswrapper[4837]: I1014 13:01:58.922149 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:58Z","lastTransitionTime":"2025-10-14T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.024692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.024765 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.024790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.024822 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.024842 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.116209 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.116278 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.116296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.116322 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.116339 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.140824 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.146256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.146357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.146376 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.146439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.146457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.166403 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.171403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.171466 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.171485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.171509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.171526 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.191206 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.195802 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.195908 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.195926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.195953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.195970 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.217414 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.222305 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.222400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.222420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.222443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.222463 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.243768 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:59Z is after 2025-08-24T17:21:41Z" Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.244005 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.245797 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.245855 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.245876 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.245902 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.245920 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.348745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.348798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.348816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.348839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.348855 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.452000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.452079 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.452098 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.452128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.452146 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.555287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.555349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.555366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.555390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.555409 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.658844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.658909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.658935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.658986 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.659008 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.761833 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.761905 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.761927 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.761953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.761970 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.784334 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.784387 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.784339 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.784515 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.784825 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:01:59 crc kubenswrapper[4837]: E1014 13:01:59.784719 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.865055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.865114 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.865132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.865186 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.865216 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.969034 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.969093 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.969110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.969135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:01:59 crc kubenswrapper[4837]: I1014 13:01:59.969151 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:01:59Z","lastTransitionTime":"2025-10-14T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.071925 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.071969 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.071980 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.071996 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.072007 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.175441 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.175517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.175530 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.175550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.175563 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.278840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.278910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.278928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.278954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.278972 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.381843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.381917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.381937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.381963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.381979 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.485065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.485133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.485197 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.485228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.485251 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.587820 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.587897 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.587920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.587949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.587976 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.691024 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.691084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.691101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.691124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.691141 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.784475 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:00 crc kubenswrapper[4837]: E1014 13:02:00.784647 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.795948 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.796050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.796070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.796096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.796113 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.899190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.899257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.899273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.899297 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:00 crc kubenswrapper[4837]: I1014 13:02:00.899314 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:00Z","lastTransitionTime":"2025-10-14T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.002090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.002141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.002193 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.002216 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.002234 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.104516 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.104571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.104588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.104612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.104628 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.208343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.208408 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.208429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.208461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.208482 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.311888 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.311947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.311964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.311988 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.312006 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.415203 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.415256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.415273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.415296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.415313 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.518434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.518491 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.518508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.518532 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.518552 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.622109 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.622242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.622267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.622296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.622317 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.725458 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.725625 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.725649 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.725675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.725693 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.783833 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.783903 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.783853 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:01 crc kubenswrapper[4837]: E1014 13:02:01.784040 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:01 crc kubenswrapper[4837]: E1014 13:02:01.784192 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:01 crc kubenswrapper[4837]: E1014 13:02:01.784302 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.828394 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.828439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.828455 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.828480 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.828498 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.932007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.932068 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.932081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.932127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:01 crc kubenswrapper[4837]: I1014 13:02:01.932147 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:01Z","lastTransitionTime":"2025-10-14T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.035614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.035688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.035711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.035741 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.035765 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.138128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.138239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.138261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.138289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.138307 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.240643 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.240713 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.240742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.240770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.240790 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.343967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.344041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.344061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.344087 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.344109 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.446742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.446834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.446862 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.446892 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.446917 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.550417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.550475 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.550495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.550517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.550536 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.653879 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.653940 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.653963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.653992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.654013 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.757795 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.757885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.757910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.757939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.757959 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.783937 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:02 crc kubenswrapper[4837]: E1014 13:02:02.784480 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.807046 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.827224 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.850855 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.860873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.861091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.861470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.861626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.861769 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.873000 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z 
is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.889346 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.907619 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.940036 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.954016 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc 
kubenswrapper[4837]: I1014 13:02:02.970746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.971268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.971497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.972221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.972621 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:02Z","lastTransitionTime":"2025-10-14T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.974557 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:02 crc kubenswrapper[4837]: I1014 13:02:02.988890 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:02Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.006786 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.026807 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.041608 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.061443 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.076125 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.076578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.077045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.077314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.077519 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.077410 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.095323 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.110862 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.137863 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 
obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:03Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.180307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.181735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.181861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.181977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.182095 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.284508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.284744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.284809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.284880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.284949 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.387474 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.387847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.387978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.388086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.388247 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.490986 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.491057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.491080 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.491111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.491135 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.594226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.594475 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.594539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.594599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.594659 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.697236 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.697291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.697307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.697329 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.697345 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.784564 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.784637 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:03 crc kubenswrapper[4837]: E1014 13:02:03.784794 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:03 crc kubenswrapper[4837]: E1014 13:02:03.785006 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.785494 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:03 crc kubenswrapper[4837]: E1014 13:02:03.785650 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.786141 4837 scope.go:117] "RemoveContainer" containerID="b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c" Oct 14 13:02:03 crc kubenswrapper[4837]: E1014 13:02:03.786433 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.800250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.800282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.800295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.800313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.800327 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.903858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.903966 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.903995 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.904034 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:03 crc kubenswrapper[4837]: I1014 13:02:03.904063 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:03Z","lastTransitionTime":"2025-10-14T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.007191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.007257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.007285 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.007316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.007347 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.110572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.110640 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.110662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.110692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.110715 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.213613 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.213684 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.213701 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.213728 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.213745 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.320945 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.321032 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.321056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.321206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.321238 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.424953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.425021 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.425044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.425074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.425097 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.528867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.528914 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.529131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.529199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.529218 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.631967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.631995 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.632003 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.632015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.632023 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.735108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.735135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.735143 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.735171 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.735180 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.784674 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:04 crc kubenswrapper[4837]: E1014 13:02:04.784815 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.837938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.837984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.837996 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.838012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.838025 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.941680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.941898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.941937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.941954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:04 crc kubenswrapper[4837]: I1014 13:02:04.941968 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:04Z","lastTransitionTime":"2025-10-14T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.044446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.044508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.044531 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.044555 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.044572 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.147598 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.147661 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.147678 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.147703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.147720 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.250388 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.250445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.250463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.250545 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.250564 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.353274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.353323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.353340 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.353368 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.353385 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.456968 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.457128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.457191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.457222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.457243 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.561016 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.561072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.561089 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.561116 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.561134 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.665005 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.665088 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.665113 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.665142 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.665207 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.767882 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.767924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.767934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.767951 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.767961 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.783432 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.783435 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.783491 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:05 crc kubenswrapper[4837]: E1014 13:02:05.783566 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:05 crc kubenswrapper[4837]: E1014 13:02:05.783738 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:05 crc kubenswrapper[4837]: E1014 13:02:05.784207 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.870573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.870641 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.870659 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.870684 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.870701 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.974550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.974633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.974651 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.974675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:05 crc kubenswrapper[4837]: I1014 13:02:05.974692 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:05Z","lastTransitionTime":"2025-10-14T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.077936 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.077982 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.077994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.078011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.078024 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.180713 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.180766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.180786 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.180810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.180828 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.283368 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.283409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.283419 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.283433 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.283445 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.385652 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.385718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.385743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.385771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.385792 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.488355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.488417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.488439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.488468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.488489 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.591390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.591437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.591450 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.591468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.591480 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.694317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.694393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.694418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.694446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.694469 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.784301 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:06 crc kubenswrapper[4837]: E1014 13:02:06.784503 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.796967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.797025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.797045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.797067 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.797085 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.899553 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.899606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.899617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.899633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:06 crc kubenswrapper[4837]: I1014 13:02:06.899648 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:06Z","lastTransitionTime":"2025-10-14T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.001791 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.001828 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.001839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.001856 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.001867 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.104709 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.105340 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.105471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.105604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.105704 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.208699 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.208943 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.209044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.209136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.209254 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.313078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.313147 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.313195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.313221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.313243 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.415684 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.415736 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.415748 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.415766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.415779 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.518393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.518427 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.518435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.518448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.518457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.621316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.621362 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.621374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.621392 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.621406 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.723540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.723926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.724056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.724261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.724505 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.783543 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:07 crc kubenswrapper[4837]: E1014 13:02:07.783677 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.783559 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:07 crc kubenswrapper[4837]: E1014 13:02:07.783889 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.784145 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:07 crc kubenswrapper[4837]: E1014 13:02:07.784553 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.827332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.827609 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.827697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.827815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.827897 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.930186 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.930233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.930249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.930282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:07 crc kubenswrapper[4837]: I1014 13:02:07.930299 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:07Z","lastTransitionTime":"2025-10-14T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.033015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.033077 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.033095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.033120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.033138 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.135508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.135566 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.135583 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.135610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.135632 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.237974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.238045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.238063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.238092 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.238110 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.341048 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.341093 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.341106 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.341131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.341147 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.444302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.444703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.444859 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.444990 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.445116 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.548523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.549421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.549564 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.549655 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.549734 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.653374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.653710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.653778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.653843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.653912 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.756734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.757068 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.757288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.757452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.757624 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.784440 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:08 crc kubenswrapper[4837]: E1014 13:02:08.784571 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.860286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.860346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.860355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.860369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.860379 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.962227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.962272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.962285 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.962301 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:08 crc kubenswrapper[4837]: I1014 13:02:08.962312 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:08Z","lastTransitionTime":"2025-10-14T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.065148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.065263 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.065280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.065305 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.065325 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.167450 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.167496 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.167508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.167522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.167531 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.269618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.269690 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.269742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.269776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.269797 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.354728 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.354800 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.354821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.354845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.354861 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.370340 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.373966 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.374026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.374044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.374067 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.374083 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.388801 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.393481 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.393550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.393569 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.393594 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.393611 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.408963 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.412546 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.412601 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.412620 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.412644 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.412665 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.423992 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.427318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.427366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.427383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.427403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.427419 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.438026 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:09Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.438269 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.439893 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.439954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.439973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.439998 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.440015 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.543049 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.543116 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.543133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.543189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.543217 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.646084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.646133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.646150 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.646211 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.646230 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.748356 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.748568 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.748577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.748590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.748599 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.784139 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.784231 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.784292 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.784152 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.784353 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:09 crc kubenswrapper[4837]: E1014 13:02:09.784476 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.851058 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.851104 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.851117 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.851132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.851143 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.953175 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.953209 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.953218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.953233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:09 crc kubenswrapper[4837]: I1014 13:02:09.953241 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:09Z","lastTransitionTime":"2025-10-14T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.055865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.055937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.055959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.055988 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.056009 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.158871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.158935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.158954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.158978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.158996 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.261493 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.261533 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.261542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.261557 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.261984 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.367759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.368544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.368694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.368798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.368885 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.472026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.472079 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.472091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.472118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.472132 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.567397 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:10 crc kubenswrapper[4837]: E1014 13:02:10.567627 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:02:10 crc kubenswrapper[4837]: E1014 13:02:10.567729 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:02:42.567702308 +0000 UTC m=+100.484702161 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.574911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.574933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.574941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.574954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.574963 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.677893 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.677960 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.677979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.678002 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.678020 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.781318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.781371 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.781383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.781400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.781411 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.784044 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:10 crc kubenswrapper[4837]: E1014 13:02:10.784176 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.883934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.884001 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.884034 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.884071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.884085 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.987257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.987313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.987323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.987345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:10 crc kubenswrapper[4837]: I1014 13:02:10.987359 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:10Z","lastTransitionTime":"2025-10-14T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.089730 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.089798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.089822 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.089853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.089876 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.192663 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.192743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.192776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.192805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.192827 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.295900 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.295973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.296006 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.296036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.296058 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.398789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.398854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.398877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.398905 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.398925 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.501955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.502007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.502027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.502049 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.502065 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.604563 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.604624 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.604642 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.604666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.604683 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.708036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.708090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.708108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.708130 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.708147 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.784435 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.784524 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:11 crc kubenswrapper[4837]: E1014 13:02:11.784610 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.784518 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:11 crc kubenswrapper[4837]: E1014 13:02:11.784732 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:11 crc kubenswrapper[4837]: E1014 13:02:11.784979 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.810122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.810187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.810200 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.810216 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.810228 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.912817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.912916 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.912936 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.913354 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:11 crc kubenswrapper[4837]: I1014 13:02:11.913565 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:11Z","lastTransitionTime":"2025-10-14T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.016857 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.016923 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.016940 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.016967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.016984 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.120301 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.120361 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.120378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.120403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.120420 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.223118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.223205 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.223223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.223249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.223267 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.243081 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/0.log" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.243146 4837 generic.go:334] "Generic (PLEG): container finished" podID="01492025-d672-4746-af22-53fa41a3f612" containerID="ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc" exitCode=1 Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.243219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerDied","Data":"ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.243726 4837 scope.go:117] "RemoveContainer" containerID="ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.257505 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.269100 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.283620 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.296923 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.312819 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.328769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.328812 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.328824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc 
kubenswrapper[4837]: I1014 13:02:12.328844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.328857 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.332063 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.343019 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.361111 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 
obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.374385 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.389116 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.405999 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.425882 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.431051 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.431091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.431106 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.431126 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.431141 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.442949 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.458546 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.475337 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.488062 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.501483 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.510353 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc 
kubenswrapper[4837]: I1014 13:02:12.533604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.533633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.533641 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.533656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.533664 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.636486 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.636544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.636565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.636588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.636607 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.738697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.738725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.738734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.738747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.738756 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.784258 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:12 crc kubenswrapper[4837]: E1014 13:02:12.784399 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.801327 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.817490 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.840741 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.843405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.843464 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.843477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.843494 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.843507 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.855812 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.866318 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.877895 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.897645 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.908927 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc 
kubenswrapper[4837]: I1014 13:02:12.921488 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.932105 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.945621 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.946706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.946745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.946761 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.946783 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.946799 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:12Z","lastTransitionTime":"2025-10-14T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.957456 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.969480 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.980455 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:12 crc kubenswrapper[4837]: I1014 13:02:12.992352 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:12Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.003141 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.026477 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.049220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.049263 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.049276 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.049292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.049304 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.074565 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 
obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.151823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.151845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.151853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.151866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.151874 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.249023 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/0.log" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.249081 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerStarted","Data":"01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.255335 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.255363 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.255374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.255388 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.255400 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.268795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.283433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.301238 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.312976 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.325234 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.337356 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.355252 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.358328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.358380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.358395 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.358414 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.358429 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.383871 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 
obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.409617 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.427629 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.444502 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.461432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.461467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.461477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.461495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.461506 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.512587 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.533262 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] 
Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.549295 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0
cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.564374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.564428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.564449 4837 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.564512 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.564533 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.565560 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.581577 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.594347 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc 
kubenswrapper[4837]: I1014 13:02:13.608440 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:13Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.666618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.666656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.666667 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.666683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.666695 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.768129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.768202 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.768218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.768232 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.768243 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.783918 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.783957 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.783961 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:13 crc kubenswrapper[4837]: E1014 13:02:13.784010 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:13 crc kubenswrapper[4837]: E1014 13:02:13.784224 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:13 crc kubenswrapper[4837]: E1014 13:02:13.784290 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.870452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.870503 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.870520 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.870542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.870559 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.972582 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.972620 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.972631 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.972645 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:13 crc kubenswrapper[4837]: I1014 13:02:13.972658 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:13Z","lastTransitionTime":"2025-10-14T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.075025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.075051 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.075059 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.075072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.075081 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.177479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.177507 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.177515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.177527 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.177537 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.279723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.279756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.279767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.279782 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.279793 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.382197 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.382238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.382266 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.382284 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.382296 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.487706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.487771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.487792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.487815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.487840 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.591136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.591196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.591206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.591222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.591233 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.692816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.692875 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.692941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.692954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.692962 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.784007 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:14 crc kubenswrapper[4837]: E1014 13:02:14.784288 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.795426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.795483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.795493 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.795503 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.795511 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.898583 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.898622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.898637 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.898655 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:14 crc kubenswrapper[4837]: I1014 13:02:14.898667 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:14Z","lastTransitionTime":"2025-10-14T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.001477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.001536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.001555 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.001579 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.001598 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.105039 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.105085 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.105097 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.105113 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.105125 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.208256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.208332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.208355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.208384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.208407 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.311019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.311076 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.311093 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.311118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.311136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.414026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.414090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.414107 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.414132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.414153 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.517144 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.517234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.517252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.517276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.517293 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.619671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.619720 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.619734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.619755 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.619771 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.722907 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.722959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.722976 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.723001 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.723017 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.784858 4837 scope.go:117] "RemoveContainer" containerID="b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.785361 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:15 crc kubenswrapper[4837]: E1014 13:02:15.785470 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.785471 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.785511 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:15 crc kubenswrapper[4837]: E1014 13:02:15.785590 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:15 crc kubenswrapper[4837]: E1014 13:02:15.785724 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.825310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.825370 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.825386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.825413 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.825430 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.928083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.928146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.928180 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.928199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:15 crc kubenswrapper[4837]: I1014 13:02:15.928212 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:15Z","lastTransitionTime":"2025-10-14T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.030879 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.030941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.030959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.030985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.031007 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.133942 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.133985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.133996 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.134011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.134024 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.236485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.236535 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.236552 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.236575 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.236592 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.261529 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/2.log" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.264610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.266310 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.287079 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75
d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.306795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.334175 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.339039 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.339095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.339112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.339135 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.339151 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.349870 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c3
02bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.367473 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] 
Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.375904 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0
cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.384348 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.392820 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.402926 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc 
kubenswrapper[4837]: I1014 13:02:16.414330 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.427843 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.441346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.441388 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.441400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.441416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.441427 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.448564 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.463113 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.475652 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.488651 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.503714 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476
b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.521125 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.543820 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.543854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.543866 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.543880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.543888 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.544657 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 
obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:16Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.645761 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.645818 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.645836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.645861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.645878 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.766248 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.766280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.766289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.766303 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.766312 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.784128 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:16 crc kubenswrapper[4837]: E1014 13:02:16.784259 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.869267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.869477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.869597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.869704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.869820 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.971911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.971970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.971986 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.972009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:16 crc kubenswrapper[4837]: I1014 13:02:16.972026 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:16Z","lastTransitionTime":"2025-10-14T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.075586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.075646 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.075662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.075685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.075702 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.178702 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.178757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.178778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.178804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.178824 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.270832 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/3.log" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.271973 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/2.log" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.275974 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" exitCode=1 Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.276047 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.276128 4837 scope.go:117] "RemoveContainer" containerID="b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.277441 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:02:17 crc kubenswrapper[4837]: E1014 13:02:17.277785 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.282892 4837 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.282939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.282954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.282977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.282995 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.307540 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.328053 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.347806 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.365615 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.383505 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.385471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.385541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.385560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.385586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.385607 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.398978 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.416952 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.442619 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9c6bc8b0b8a331891a743e11be6bb8b806510dea3e498529420d487e83a167c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:01:51Z\\\",\\\"message\\\":\\\"calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:01:51Z is after 2025-08-24T17:21:41Z]\\\\nI1014 13:01:51.774511 6476 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1014 13:01:51.775751 6476 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1014 13:01:51.774191 6476 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:16Z\\\",\\\"message\\\":\\\"perator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-autoscaler-operator 
openshift-machine-api b8e0040a-0eca-4299-ac4a-f26a24879998 4394 0 2025-02-23 05:12:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-autoscaler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-autoscaler-operator-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0078b6fdb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:metrics,Protocol:TCP,Port:9192,TargetPort:{1 0 metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
cluster-autoscaler-operator,},ClusterIP:10.217.5.245,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamily\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:02:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\
"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.470618 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.488769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.488829 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.488847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.488868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.488883 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.488771 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.506315 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.526458 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.549706 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.567611 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.582775 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.593415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.593469 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.593486 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.593509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.593524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.598111 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee5
5b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e9370014a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.1
26.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.610367 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc 
kubenswrapper[4837]: I1014 13:02:17.624879 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:17Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.696033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.696091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.696109 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.696134 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.696151 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.783466 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.783507 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.783606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:17 crc kubenswrapper[4837]: E1014 13:02:17.783795 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:17 crc kubenswrapper[4837]: E1014 13:02:17.783951 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:17 crc kubenswrapper[4837]: E1014 13:02:17.784106 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.798809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.798854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.798872 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.798892 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.798908 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.901595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.901642 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.901659 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.901679 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:17 crc kubenswrapper[4837]: I1014 13:02:17.901694 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:17Z","lastTransitionTime":"2025-10-14T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.004776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.005065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.005134 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.005190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.005208 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.108525 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.108590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.108609 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.108635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.108657 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.211187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.211255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.211278 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.211307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.211330 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.282399 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/3.log" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.287050 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:02:18 crc kubenswrapper[4837]: E1014 13:02:18.287227 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.310014 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.315044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.315468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.315705 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.315943 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.316147 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.332792 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.349299 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.367943 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.399127 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.418473 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.420047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.420086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.420101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.420125 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.420142 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.437606 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.454810 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.471545 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.485427 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc 
kubenswrapper[4837]: I1014 13:02:18.502306 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.517892 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.523463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.523505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.523515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.523531 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.523541 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.532693 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.547306 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.566577 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.586755 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\"
,\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.606227 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.626416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.626514 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.626533 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc 
kubenswrapper[4837]: I1014 13:02:18.626605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.626632 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.635493 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:16Z\\\",\\\"message\\\":\\\"perator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{cluster-autoscaler-operator openshift-machine-api b8e0040a-0eca-4299-ac4a-f26a24879998 4394 0 2025-02-23 05:12:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-autoscaler-operator] 
map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-autoscaler-operator-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0078b6fdb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:metrics,Protocol:TCP,Port:9192,TargetPort:{1 0 metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: cluster-autoscaler-operator,},ClusterIP:10.217.5.245,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamily\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:18Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.728941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.729007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.729025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.729050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.729068 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.783574 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:18 crc kubenswrapper[4837]: E1014 13:02:18.783775 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.831672 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.831726 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.831742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.831762 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.831780 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.934337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.934386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.934403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.934426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:18 crc kubenswrapper[4837]: I1014 13:02:18.934442 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:18Z","lastTransitionTime":"2025-10-14T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.037342 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.037414 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.037437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.037467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.037489 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.140448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.140512 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.140531 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.140560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.140581 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.243225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.243275 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.243290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.243312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.243330 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.346468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.346523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.346541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.346566 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.346583 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.449534 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.449599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.449617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.449642 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.449659 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.553152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.553253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.553273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.553296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.553313 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.654628 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.654692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.654710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.654735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.654752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.714044 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.717837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.717864 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.717872 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.717886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.717896 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.730681 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.734916 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.734976 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.734991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.735014 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.735030 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.751478 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.755589 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.755648 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.755666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.755689 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.755707 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.771798 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.775607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.775666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.775685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.775707 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.775723 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.783413 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.783494 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.783549 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.783497 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.783788 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.783965 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.788627 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:19Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:19 crc kubenswrapper[4837]: E1014 13:02:19.788789 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.790479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.790523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.790538 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.790560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.790577 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.893354 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.893399 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.893435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.893453 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.893465 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.996485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.996543 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.996560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.996583 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:19 crc kubenswrapper[4837]: I1014 13:02:19.996600 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:19Z","lastTransitionTime":"2025-10-14T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.099256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.099329 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.099346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.099367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.099383 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.202771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.202840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.202858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.202883 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.202904 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.306180 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.306214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.306223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.306260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.306273 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.408848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.408910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.408937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.408965 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.408988 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.511958 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.512047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.512065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.512090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.512107 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.614762 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.614837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.614865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.614894 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.614914 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.718277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.718344 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.718366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.718395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.718420 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.784327 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:20 crc kubenswrapper[4837]: E1014 13:02:20.784518 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.822645 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.822726 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.822748 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.822779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.822803 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.925505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.925549 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.925559 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.925577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:20 crc kubenswrapper[4837]: I1014 13:02:20.925615 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:20Z","lastTransitionTime":"2025-10-14T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.028446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.028496 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.028506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.028523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.028535 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.132091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.132201 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.132263 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.132293 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.132315 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.235443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.235476 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.235490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.235509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.235522 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.337997 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.338065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.338085 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.338152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.338224 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.440842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.440921 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.440939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.440965 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.440990 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.543431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.543482 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.543501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.543525 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.543542 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.645856 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.645919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.645937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.645965 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.645982 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.749629 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.749697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.749716 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.749746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.749767 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.784297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.784361 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.784320 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:21 crc kubenswrapper[4837]: E1014 13:02:21.784516 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:21 crc kubenswrapper[4837]: E1014 13:02:21.784728 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:21 crc kubenswrapper[4837]: E1014 13:02:21.784947 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.852791 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.853036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.853053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.853078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.853095 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.956257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.956320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.956343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.956372 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:21 crc kubenswrapper[4837]: I1014 13:02:21.956394 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:21Z","lastTransitionTime":"2025-10-14T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.059550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.059590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.059599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.059618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.059627 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.163044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.163108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.163128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.163154 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.163201 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.266062 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.266131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.266150 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.266211 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.266236 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.369636 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.369727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.369754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.369787 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.369810 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.473337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.473394 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.473413 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.473436 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.473453 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.577568 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.577631 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.577648 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.577673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.577689 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.681331 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.681416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.681438 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.681473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.681498 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.783958 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.784325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.784468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.784484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.784507 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.784524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: E1014 13:02:22.784529 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.803215 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45115602-86ba-4cc2-9bf2-6b28caa24da7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aeafa6951bf71c8ca5ddef998d6f7c4d376ec0a1d9d0a35039cc1e655fc9e406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f18b0fbdb93e601d9bb2761fa7f42abcb06e91764387543d67946fc57f44b2e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfdb1f7c7226ca7ed39aaa71a9e2aadf5e4c5f72d941c697e282e4c2e741c1d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4694d32db4d0410b8392e3ab110d15e0947a102ffe7565ef248a29fb1b8ab8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e414176a32f813e098df483dd1b0f4470a34399c0306597a20a78faaefc127a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-14T13:01:16Z\\\",\\\"message\\\":\\\"W1014 13:01:05.923625 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1014 13:01:05.923993 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760446865 cert, and key in /tmp/serving-cert-1988714183/serving-signer.crt, /tmp/serving-cert-1988714183/serving-signer.key\\\\nI1014 13:01:06.221191 1 observer_polling.go:159] Starting file observer\\\\nW1014 13:01:06.223530 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1014 13:01:06.223790 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1014 13:01:06.225014 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1988714183/tls.crt::/tmp/serving-cert-1988714183/tls.key\\\\\\\"\\\\nF1014 13:01:16.599502 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e20adc4aed87dd2a211b35f034629c207eccd39b9a3698d36cb2fb2dfd25412\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891828baafa95641ed8cc7ab55e04f2aa7aaa4855441d7e35e98f2907c94e99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.822459 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.845539 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.859507 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132aa5d326e8d7c1b5ce4534b4002bac581d409ce362ed4a6d719c480ff42ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.879000 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l7bgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4ade557-3f1e-4a87-8269-24f33cdafcef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6720522797a315c93b12c85a9618d23c1510c09a9601a3ceef35320cfcbef9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-4ld24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l7bgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.887136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.887234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.887260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.887303 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.887329 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.905942 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6cc051da3f70e3d05b5a8c5ae6a476b897dd0aa64d8b7db5ef73c61515ec2d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xf45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h4ggd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.923836 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3f3598e-e8c5-4960-bbb5-03aa5084e4da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d1c241d4937f454683b3d8d3105d8c3f254a2e5acedff50e4baac0d6f1b666e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35212c08596e69e4b20b419939dd054710cda9b3b9525c77b29480c5cb734aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3ce55d19472f7fdbf9e207eb28f1023c4935343ee749ae7396b191121491ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6afef88bdf7f537728996ef8501adc5e19382b2009ae78a54bbf195794e49ef5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-14T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.947116 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f670a3c6-520c-45ba-980a-00c63703b02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-14T13:02:16Z\\\",\\\"message\\\":\\\"perator retrieved from lister for network=default: 
\\\\u0026Service{ObjectMeta:{cluster-autoscaler-operator openshift-machine-api b8e0040a-0eca-4299-ac4a-f26a24879998 4394 0 2025-02-23 05:12:30 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:cluster-autoscaler-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:cluster-autoscaler-operator-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0078b6fdb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},ServicePort{Name:metrics,Protocol:TCP,Port:9192,TargetPort:{1 0 metrics},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: cluster-autoscaler-operator,},ClusterIP:10.217.5.245,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamily\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:02:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3497f1d433933460
47d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6t64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xfw4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.972042 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17cfb80c-b9a1-4c26-9be5-431f5da0c786\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aae54115cc40fb4ff3698b61ec9fb37f0ad497c7bd7f57721cd49f7d9416e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c5262fb5314724330c41733f84be25270de7377e34a501a92092da7a5db139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab8683651dff35fcc49bd546cd6fbe586adc98d39183fbed8a0adf04d574f823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebfd51457918f76bdaee05850dc43dc30283adaf7946ae9656be7ca029ba9fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3c9fb1a83c4f1171a6d8481354c6f138aae501d334849fccdd411cbafac476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d780c97a2ef164ae940c52135686b9838a51b15376765497a44d37463ce5f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10db9cdf8f03322b31995e802f2d728b1aa670e8f46bbf7affec62839cebfe86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75d6abc0fda4fb8966365371eb1469588e70e2f4740c7dc9ebffa66e26d0eebd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.990032 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e3d286139fc6a837dd195e329a6277d8d6e6f91e3652fb9f6ea6ebea860b31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:22Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.990371 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.990397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.990415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.990438 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:22 crc kubenswrapper[4837]: I1014 13:02:22.990454 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:22Z","lastTransitionTime":"2025-10-14T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.009290 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf2b95a2db3e1e19507139f3fccfd50f5b83e8e349ad9f9edafdb1203f93a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d74791e5518332fb531ca8b034dc82ec2cf60475172ccb91428725c7e69d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.033864 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r24ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"724908de-ffce-4ba4-8695-c9757f3b9b73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90a5f566eaf2efdc55980e212ac1cfb5a16b72757d69ff30ee80ccc1a8ef7a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71d176bde4f73c00ab68153f1b83871c358793566ad07b4386e707abf6ba5e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ddc34ffb6cdcd63dfaffb8d84cf14ab54f27f658c9ac59c0e425054f8559c28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654822029da77ab6f0e8bed50e4a03965a9d01f0ab6676b6d7af2c93d9609500\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba3
6df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0ba36df718ea836653f4e7be089005d39470c9b045b53c59306809f26699067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da4a61513b0e14bde79ff00b428ed162116aa577a6cfba66a5ab40ae4f4abf43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dfe024d33a16b55ab97f5642a568dfa67d6b49d927d29d9c302bffde97c7f58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z2nhc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r24ng\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.057866 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6qr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01492025-d672-4746-af22-53fa41a3f612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-14T13:02:11Z\\\",\\\"message\\\":\\\"2025-10-14T13:01:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f\\\\n2025-10-14T13:01:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_db38d11b-bc7c-4500-945f-eede84bc5d0f to /host/opt/cni/bin/\\\\n2025-10-14T13:01:26Z [verbose] multus-daemon started\\\\n2025-10-14T13:01:26Z [verbose] Readiness Indicator file check\\\\n2025-10-14T13:02:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:02:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6qr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.077119 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2xkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3e1251-eebb-4db2-8db1-1d8c63a7660b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6657e38282b9602c0cbd1c654b0e02a4139a1fbe8493819fd78b39c6e3a22fef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2hmwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2xkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.097706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.097756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.097768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.097782 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.097792 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.097748 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2200b076-edbb-461f-bf0e-a3e9c81f4b73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://468abe37900b429121a1b0b5496c5d6e2095841df06cbf3f69808626549f14da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3002489af6587f94b4218885d4966d
01e010411f4befe4ade88930a207bcf536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a6d32569858aa0605d4ac90b41411baccccd783082dab62a380ccc2d15ce83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://101d0c7a19e40e667c22ba1c74835447d2aa183b184859429e1ec2e38bdfea37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-14T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-14T13:01:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.115498 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5404624d-032f-4f37-a72e-101c5d301082\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc79437649bc568586808a6ef3e8f48818ee55b20849a2af4ec443e6e9c9c5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88fdb331e71698668175ac751c88fd1e93700
14a960854cdc6ccdf487d8a5c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-14T13:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9lzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bgqhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.130392 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c934a24-9e12-46eb-851e-1a6925dc8909\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8v2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-14T13:01:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pcpcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc 
kubenswrapper[4837]: I1014 13:02:23.145683 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-14T13:01:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:23Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.200957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.201033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.201059 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.201091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.201113 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.304050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.304117 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.304135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.304185 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.304206 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.407292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.407346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.407367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.407390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.407407 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.511249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.511338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.511366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.511398 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.511422 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.614621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.614681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.614698 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.614722 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.614740 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.718070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.718145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.718195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.718223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.718241 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.784434 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.784510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.784504 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:23 crc kubenswrapper[4837]: E1014 13:02:23.784703 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:23 crc kubenswrapper[4837]: E1014 13:02:23.784893 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:23 crc kubenswrapper[4837]: E1014 13:02:23.785035 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.822146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.822228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.822246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.822270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.822288 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.925710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.925775 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.925793 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.925819 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:23 crc kubenswrapper[4837]: I1014 13:02:23.925837 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:23Z","lastTransitionTime":"2025-10-14T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.029576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.029652 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.029668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.029696 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.029795 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.133523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.133588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.133607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.133633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.133652 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.237112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.237230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.237255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.237284 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.237306 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.340040 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.340113 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.340132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.340185 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.340205 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.443760 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.443827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.443847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.443873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.443892 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.548203 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.548276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.548293 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.548321 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.548338 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.655858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.655920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.655940 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.655970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.655990 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.759026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.759063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.759079 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.759102 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.759120 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.784094 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:24 crc kubenswrapper[4837]: E1014 13:02:24.784353 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.862393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.862456 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.862473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.862500 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.862518 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.965958 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.966042 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.966061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.966097 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:24 crc kubenswrapper[4837]: I1014 13:02:24.966121 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:24Z","lastTransitionTime":"2025-10-14T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.069198 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.069295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.069311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.069339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.069359 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.172215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.172278 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.172296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.172322 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.172342 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.274797 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.274860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.274876 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.274901 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.274920 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.378231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.378305 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.378327 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.378359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.378380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.481680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.481744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.481764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.481791 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.481810 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.585151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.585268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.585291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.585325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.585343 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.691195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.691252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.691263 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.691280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.691291 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.778965 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.779068 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.779102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.779128 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.779176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779272 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779328 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.779310892 +0000 UTC m=+147.696310715 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779442 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779484 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779496 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779546 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.779508557 +0000 UTC m=+147.696508380 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779448 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779641 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779666 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779697 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" 
not registered Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779589 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.779576229 +0000 UTC m=+147.696576152 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779813 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.779748593 +0000 UTC m=+147.696748436 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.779841 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:03:29.779828206 +0000 UTC m=+147.696828169 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.783740 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.783751 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.783933 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.783799 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.784050 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:25 crc kubenswrapper[4837]: E1014 13:02:25.784246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.793765 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.793803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.793817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.793834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.793846 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.897011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.897091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.897113 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.897143 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:25 crc kubenswrapper[4837]: I1014 13:02:25.897217 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:25Z","lastTransitionTime":"2025-10-14T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.000834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.000914 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.000934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.000958 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.000977 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.103683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.103763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.103788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.103816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.103837 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.206352 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.206430 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.206443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.206462 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.206476 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.309680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.309747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.309770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.309798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.309820 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.413332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.413378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.413393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.413415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.413431 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.516393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.516450 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.516472 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.516504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.516526 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.619337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.619384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.619400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.619423 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.619440 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.722807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.722866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.722891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.722919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.722940 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.784519 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:26 crc kubenswrapper[4837]: E1014 13:02:26.784730 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.826002 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.826062 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.826095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.826123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.826143 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.930081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.930187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.930232 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.930262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:26 crc kubenswrapper[4837]: I1014 13:02:26.930284 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:26Z","lastTransitionTime":"2025-10-14T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.033877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.033939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.033960 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.033984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.034004 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.137500 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.137558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.137575 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.137599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.137617 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.240971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.241035 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.241052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.241078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.241098 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.343557 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.343634 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.343656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.343823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.343948 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.446953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.447017 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.447044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.447075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.447098 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.549668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.549748 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.549766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.549792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.549809 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.652923 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.652998 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.653018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.653046 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.653066 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.756108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.756206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.756225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.756250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.756269 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.784112 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.784222 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.784127 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:27 crc kubenswrapper[4837]: E1014 13:02:27.784371 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:27 crc kubenswrapper[4837]: E1014 13:02:27.784602 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:27 crc kubenswrapper[4837]: E1014 13:02:27.784821 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.858365 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.858437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.858455 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.858481 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.858500 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.962341 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.962404 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.962421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.962447 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:27 crc kubenswrapper[4837]: I1014 13:02:27.962464 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:27Z","lastTransitionTime":"2025-10-14T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.066076 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.066129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.066147 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.066224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.066248 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.169146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.169225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.169237 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.169256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.169268 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.273095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.273150 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.273194 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.273217 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.273234 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.377391 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.377484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.377527 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.377567 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.377594 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.481443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.481504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.481522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.481544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.481563 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.583781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.583857 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.583879 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.583911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.583935 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.686502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.686545 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.686556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.686572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.686584 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.784562 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:28 crc kubenswrapper[4837]: E1014 13:02:28.784783 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.789991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.790065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.790090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.790118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.790140 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.893506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.893568 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.893588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.893614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.893630 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.996108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.996214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.996239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.996271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:28 crc kubenswrapper[4837]: I1014 13:02:28.996293 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:28Z","lastTransitionTime":"2025-10-14T13:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.099815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.099868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.099887 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.099907 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.099920 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.203217 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.203290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.203315 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.203348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.203370 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.307276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.307345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.307362 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.307388 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.307406 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.410866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.410928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.410945 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.410969 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.410987 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.513776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.513844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.513861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.513885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.513906 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.616591 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.616736 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.616753 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.616776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.616793 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.720251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.720330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.720354 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.720389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.720412 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.785961 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.786219 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.786471 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.786561 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.786690 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.786930 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.824080 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.824221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.824254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.824286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.824311 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.892371 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.892415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.892426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.892442 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.892452 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.913062 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.918492 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.918563 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.918580 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.918606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.918623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.941260 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.946949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.947036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.947053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.947071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.947122 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.965023 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.969956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.970053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.970071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.970096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.970116 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:29 crc kubenswrapper[4837]: E1014 13:02:29.991119 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:29Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.997050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.997112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.997145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.997204 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:29 crc kubenswrapper[4837]: I1014 13:02:29.997226 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:29Z","lastTransitionTime":"2025-10-14T13:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: E1014 13:02:30.024729 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-14T13:02:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0f27438a-32be-43a2-9e58-7bdea433e25c\\\",\\\"systemUUID\\\":\\\"ea84f05e-4f20-4ec0-a4d1-23ededd0f865\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-14T13:02:30Z is after 2025-08-24T17:21:41Z" Oct 14 13:02:30 crc kubenswrapper[4837]: E1014 13:02:30.024907 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.027056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.027090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.027100 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.027115 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.027126 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.130639 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.130729 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.130747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.130769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.130787 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.233663 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.233733 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.233750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.233775 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.233795 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.336938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.337029 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.337049 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.337074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.337089 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.440247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.440327 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.440350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.440385 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.440410 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.542402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.542465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.542482 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.542508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.542526 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.645123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.645354 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.645381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.645409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.645430 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.748840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.748896 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.748913 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.748938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.748960 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.784453 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:30 crc kubenswrapper[4837]: E1014 13:02:30.784664 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.852227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.852274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.852290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.852314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.852331 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.955718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.955774 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.955790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.955805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:30 crc kubenswrapper[4837]: I1014 13:02:30.955817 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:30Z","lastTransitionTime":"2025-10-14T13:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.058866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.058907 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.058922 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.058946 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.058962 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.162084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.162218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.162246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.162275 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.162301 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.266128 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.266222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.266243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.266268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.266286 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.368749 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.368819 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.368842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.368871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.368893 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.472108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.472216 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.472234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.472259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.472278 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.575673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.575758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.575776 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.575804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.575825 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.678884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.678947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.678967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.678994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.679012 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.781610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.781690 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.781708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.781738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.781758 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.784221 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.784279 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.784228 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:31 crc kubenswrapper[4837]: E1014 13:02:31.784388 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:31 crc kubenswrapper[4837]: E1014 13:02:31.784550 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:31 crc kubenswrapper[4837]: E1014 13:02:31.784734 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.785664 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:02:31 crc kubenswrapper[4837]: E1014 13:02:31.785905 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.889951 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.890190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.890252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.890299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.890340 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.994033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.994111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.994124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.994143 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:31 crc kubenswrapper[4837]: I1014 13:02:31.994180 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:31Z","lastTransitionTime":"2025-10-14T13:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.097400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.097479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.097498 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.097528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.097546 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.202378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.202472 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.202501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.202536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.202566 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.305949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.306056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.306078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.306114 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.306141 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.409550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.409631 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.409654 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.409681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.409699 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.512404 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.512456 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.512473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.512494 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.512513 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.615357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.615459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.615478 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.615504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.615524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.718633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.718703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.718729 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.718757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.718778 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.784406 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:32 crc kubenswrapper[4837]: E1014 13:02:32.784601 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.832843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.832898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.832915 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.832939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.832957 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.862831 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l7bgt" podStartSLOduration=69.862802831 podStartE2EDuration="1m9.862802831s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:32.862117152 +0000 UTC m=+90.779117045" watchObservedRunningTime="2025-10-14 13:02:32.862802831 +0000 UTC m=+90.779802684" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.885810 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podStartSLOduration=69.885780917 podStartE2EDuration="1m9.885780917s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:32.885348685 +0000 UTC m=+90.802348578" watchObservedRunningTime="2025-10-14 13:02:32.885780917 +0000 UTC m=+90.802780770" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.907563 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.907539908 podStartE2EDuration="1m7.907539908s" podCreationTimestamp="2025-10-14 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:32.907499157 +0000 UTC m=+90.824499040" watchObservedRunningTime="2025-10-14 13:02:32.907539908 +0000 UTC m=+90.824539731" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.929014 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podStartSLOduration=72.928980953 podStartE2EDuration="1m12.928980953s" podCreationTimestamp="2025-10-14 13:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:32.928598213 +0000 UTC m=+90.845598096" watchObservedRunningTime="2025-10-14 13:02:32.928980953 +0000 UTC m=+90.845980806" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.935798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.935903 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.935922 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.935946 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:32 crc kubenswrapper[4837]: I1014 13:02:32.935966 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:32Z","lastTransitionTime":"2025-10-14T13:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.029560 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-r24ng" podStartSLOduration=70.029526035 podStartE2EDuration="1m10.029526035s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:33.009446527 +0000 UTC m=+90.926446380" watchObservedRunningTime="2025-10-14 13:02:33.029526035 +0000 UTC m=+90.946525888" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.038271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.038355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.038380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.038406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.038427 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.053266 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s6qr4" podStartSLOduration=70.053232219 podStartE2EDuration="1m10.053232219s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:33.03086583 +0000 UTC m=+90.947865683" watchObservedRunningTime="2025-10-14 13:02:33.053232219 +0000 UTC m=+90.970232082" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.053988 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q2xkc" podStartSLOduration=70.053974519 podStartE2EDuration="1m10.053974519s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:33.046265943 +0000 UTC m=+90.963265836" watchObservedRunningTime="2025-10-14 13:02:33.053974519 +0000 UTC m=+90.970974372" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.070360 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.070337668 podStartE2EDuration="41.070337668s" podCreationTimestamp="2025-10-14 13:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:33.068639412 +0000 UTC m=+90.985639325" watchObservedRunningTime="2025-10-14 13:02:33.070337668 +0000 UTC m=+90.987337521" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.116447 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.116413281 podStartE2EDuration="1m7.116413281s" 
podCreationTimestamp="2025-10-14 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:33.114962211 +0000 UTC m=+91.031962104" watchObservedRunningTime="2025-10-14 13:02:33.116413281 +0000 UTC m=+91.033413134" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.141044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.141084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.141098 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.141129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.141146 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.199291 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bgqhk" podStartSLOduration=69.199272059 podStartE2EDuration="1m9.199272059s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:33.182221692 +0000 UTC m=+91.099221525" watchObservedRunningTime="2025-10-14 13:02:33.199272059 +0000 UTC m=+91.116271872" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.243063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.243104 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.243114 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.243126 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.243135 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.346326 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.346381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.346397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.346428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.346451 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.453219 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.453272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.453293 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.453409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.453431 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.556238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.556307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.556338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.556373 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.556389 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.659224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.659320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.659342 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.659366 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.659386 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.762344 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.762410 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.762421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.762462 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.762475 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.783790 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.783854 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:33 crc kubenswrapper[4837]: E1014 13:02:33.783924 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.783951 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:33 crc kubenswrapper[4837]: E1014 13:02:33.784073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:33 crc kubenswrapper[4837]: E1014 13:02:33.784201 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.865145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.865272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.865295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.865323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.865342 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.968831 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.968908 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.968932 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.968962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:33 crc kubenswrapper[4837]: I1014 13:02:33.968981 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:33Z","lastTransitionTime":"2025-10-14T13:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.072359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.072428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.072452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.072484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.072508 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.175397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.175446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.175463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.175490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.175506 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.277812 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.277887 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.277909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.277933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.277951 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.380815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.380874 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.380891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.380916 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.380934 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.484501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.484571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.484589 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.484619 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.484641 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.587978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.588101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.588123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.588148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.588210 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.691670 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.691736 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.691754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.691777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.691793 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.784093 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:34 crc kubenswrapper[4837]: E1014 13:02:34.784537 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.833447 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.833512 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.833531 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.833558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.833577 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.841671 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.936868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.936938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.936963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.936993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:34 crc kubenswrapper[4837]: I1014 13:02:34.937015 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:34Z","lastTransitionTime":"2025-10-14T13:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.040058 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.040106 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.040124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.040146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.040217 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.143691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.144096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.144122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.144152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.144250 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.247283 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.247389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.247411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.247435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.247454 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.350650 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.350686 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.350698 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.350713 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.350725 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.453551 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.453626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.453651 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.453682 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.453705 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.557338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.557400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.557417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.557442 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.557461 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.660668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.660742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.660765 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.660799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.660821 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.764357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.764431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.764454 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.764483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.764506 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.783764 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:35 crc kubenswrapper[4837]: E1014 13:02:35.783909 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.784138 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:35 crc kubenswrapper[4837]: E1014 13:02:35.784321 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.784398 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:35 crc kubenswrapper[4837]: E1014 13:02:35.784525 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.867075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.867139 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.867193 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.867220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.867237 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.970323 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.970383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.970401 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.970426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:35 crc kubenswrapper[4837]: I1014 13:02:35.970445 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:35Z","lastTransitionTime":"2025-10-14T13:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.073831 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.073888 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.073904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.073928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.073950 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.177582 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.177653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.177671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.177701 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.177721 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.281011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.281091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.281112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.281142 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.281201 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.384488 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.384535 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.384552 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.384574 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.384590 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.487814 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.487861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.487869 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.487885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.487895 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.590701 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.590758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.590797 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.590826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.590846 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.693489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.693556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.693577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.693603 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.693619 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.784438 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:36 crc kubenswrapper[4837]: E1014 13:02:36.784623 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.796832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.796896 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.796914 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.796939 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.796956 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.899625 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.899693 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.899716 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.899746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:36 crc kubenswrapper[4837]: I1014 13:02:36.899770 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:36Z","lastTransitionTime":"2025-10-14T13:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.003299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.003382 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.003400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.003426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.003445 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.106666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.106755 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.106767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.106783 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.106794 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.210190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.210246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.210261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.210281 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.210294 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.313384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.313437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.313451 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.313470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.313487 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.416417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.416496 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.416523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.416557 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.416582 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.519667 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.519721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.519737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.519761 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.519778 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.623320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.623428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.623444 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.623463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.623476 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.725706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.725764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.725780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.725802 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.725860 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.784528 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.784655 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.784672 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:37 crc kubenswrapper[4837]: E1014 13:02:37.784830 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:37 crc kubenswrapper[4837]: E1014 13:02:37.785039 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:37 crc kubenswrapper[4837]: E1014 13:02:37.785242 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.829335 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.829380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.829388 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.829403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.829417 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.932910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.932973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.932989 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.933013 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:37 crc kubenswrapper[4837]: I1014 13:02:37.933031 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:37Z","lastTransitionTime":"2025-10-14T13:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.036337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.036413 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.036424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.036442 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.036454 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.139739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.139806 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.139823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.139853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.139870 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.243427 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.243499 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.243532 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.243577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.243596 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.346890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.346960 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.346980 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.347007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.347025 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.450119 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.450242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.450269 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.450304 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.450326 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.553472 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.553522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.553539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.553563 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.553581 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.656130 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.656249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.656275 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.656304 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.656321 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.758511 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.758581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.758597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.758623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.758642 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.784461 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:38 crc kubenswrapper[4837]: E1014 13:02:38.784759 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.861922 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.861974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.861991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.862010 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.862023 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.965501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.965560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.965576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.965599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:38 crc kubenswrapper[4837]: I1014 13:02:38.965616 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:38Z","lastTransitionTime":"2025-10-14T13:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.068702 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.068817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.068836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.068862 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.068880 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.171481 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.171528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.171545 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.171569 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.171587 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.274082 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.274133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.274154 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.274215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.274234 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.376891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.376953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.376976 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.377005 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.377026 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.480624 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.480693 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.480720 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.480750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.480769 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.583695 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.583781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.583799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.583819 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.583835 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.686268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.686344 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.686362 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.686389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.686408 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.784418 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.784549 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:39 crc kubenswrapper[4837]: E1014 13:02:39.784659 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.784781 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:39 crc kubenswrapper[4837]: E1014 13:02:39.784935 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:39 crc kubenswrapper[4837]: E1014 13:02:39.785266 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.789673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.789731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.789750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.789773 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.789791 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.892067 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.892105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.892115 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.892131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.892142 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.995221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.995265 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.995281 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.995306 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:39 crc kubenswrapper[4837]: I1014 13:02:39.995325 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:39Z","lastTransitionTime":"2025-10-14T13:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.099712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.099788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.099810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.099835 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.099851 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:40Z","lastTransitionTime":"2025-10-14T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.202760 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.202794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.202804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.202817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.202826 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:40Z","lastTransitionTime":"2025-10-14T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.304638 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.304671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.304680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.304691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.304702 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:40Z","lastTransitionTime":"2025-10-14T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.396995 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.397066 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.397088 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.397110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.397127 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-14T13:02:40Z","lastTransitionTime":"2025-10-14T13:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.469252 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm"] Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.469794 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.472446 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.472644 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.473783 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.473887 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.480994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/41c53d59-45c3-4355-8e06-ed9a941e34da-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.481064 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/41c53d59-45c3-4355-8e06-ed9a941e34da-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.481188 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/41c53d59-45c3-4355-8e06-ed9a941e34da-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.481333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c53d59-45c3-4355-8e06-ed9a941e34da-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.481431 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c53d59-45c3-4355-8e06-ed9a941e34da-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.503592 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.503561329 podStartE2EDuration="6.503561329s" podCreationTimestamp="2025-10-14 13:02:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:40.501720499 +0000 UTC m=+98.418720312" watchObservedRunningTime="2025-10-14 13:02:40.503561329 +0000 UTC m=+98.420561182" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582099 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c53d59-45c3-4355-8e06-ed9a941e34da-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c53d59-45c3-4355-8e06-ed9a941e34da-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582227 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/41c53d59-45c3-4355-8e06-ed9a941e34da-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582254 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/41c53d59-45c3-4355-8e06-ed9a941e34da-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41c53d59-45c3-4355-8e06-ed9a941e34da-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582519 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/41c53d59-45c3-4355-8e06-ed9a941e34da-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.582534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/41c53d59-45c3-4355-8e06-ed9a941e34da-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.583258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41c53d59-45c3-4355-8e06-ed9a941e34da-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.592341 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41c53d59-45c3-4355-8e06-ed9a941e34da-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.615443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c53d59-45c3-4355-8e06-ed9a941e34da-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llwkm\" (UID: \"41c53d59-45c3-4355-8e06-ed9a941e34da\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.784144 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:40 crc kubenswrapper[4837]: E1014 13:02:40.784360 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:40 crc kubenswrapper[4837]: I1014 13:02:40.788956 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" Oct 14 13:02:41 crc kubenswrapper[4837]: I1014 13:02:41.374769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" event={"ID":"41c53d59-45c3-4355-8e06-ed9a941e34da","Type":"ContainerStarted","Data":"ad316617fb6b6207e9e92f4c3868c3244b9c4c0f916744f39d02802272df282c"} Oct 14 13:02:41 crc kubenswrapper[4837]: I1014 13:02:41.374851 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" event={"ID":"41c53d59-45c3-4355-8e06-ed9a941e34da","Type":"ContainerStarted","Data":"01e23d507ec6b40fb4142e45f818030421ba1e062243544f96eaef956fdf6835"} Oct 14 13:02:41 crc kubenswrapper[4837]: I1014 13:02:41.394386 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llwkm" podStartSLOduration=78.394361017 podStartE2EDuration="1m18.394361017s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:02:41.393838063 +0000 UTC m=+99.310837916" watchObservedRunningTime="2025-10-14 13:02:41.394361017 +0000 UTC m=+99.311360860" Oct 14 13:02:41 crc kubenswrapper[4837]: I1014 13:02:41.784125 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:41 crc kubenswrapper[4837]: I1014 13:02:41.784192 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:41 crc kubenswrapper[4837]: I1014 13:02:41.784269 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:41 crc kubenswrapper[4837]: E1014 13:02:41.784338 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:41 crc kubenswrapper[4837]: E1014 13:02:41.784453 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:41 crc kubenswrapper[4837]: E1014 13:02:41.784584 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:42 crc kubenswrapper[4837]: I1014 13:02:42.603526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:42 crc kubenswrapper[4837]: E1014 13:02:42.603707 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:02:42 crc kubenswrapper[4837]: E1014 13:02:42.603787 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs podName:7c934a24-9e12-46eb-851e-1a6925dc8909 nodeName:}" failed. No retries permitted until 2025-10-14 13:03:46.603762065 +0000 UTC m=+164.520761908 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs") pod "network-metrics-daemon-pcpcf" (UID: "7c934a24-9e12-46eb-851e-1a6925dc8909") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:02:42 crc kubenswrapper[4837]: I1014 13:02:42.784200 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:42 crc kubenswrapper[4837]: E1014 13:02:42.786119 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:43 crc kubenswrapper[4837]: I1014 13:02:43.783757 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:43 crc kubenswrapper[4837]: I1014 13:02:43.783821 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:43 crc kubenswrapper[4837]: E1014 13:02:43.783937 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:43 crc kubenswrapper[4837]: I1014 13:02:43.784021 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:43 crc kubenswrapper[4837]: E1014 13:02:43.784067 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:43 crc kubenswrapper[4837]: E1014 13:02:43.784310 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:44 crc kubenswrapper[4837]: I1014 13:02:44.784547 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:44 crc kubenswrapper[4837]: E1014 13:02:44.784766 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:44 crc kubenswrapper[4837]: I1014 13:02:44.785302 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:02:44 crc kubenswrapper[4837]: E1014 13:02:44.785515 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:02:45 crc kubenswrapper[4837]: I1014 13:02:45.784522 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:45 crc kubenswrapper[4837]: I1014 13:02:45.784561 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:45 crc kubenswrapper[4837]: E1014 13:02:45.784688 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:45 crc kubenswrapper[4837]: E1014 13:02:45.784800 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:45 crc kubenswrapper[4837]: I1014 13:02:45.785385 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:45 crc kubenswrapper[4837]: E1014 13:02:45.785629 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:46 crc kubenswrapper[4837]: I1014 13:02:46.784547 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:46 crc kubenswrapper[4837]: E1014 13:02:46.784743 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:47 crc kubenswrapper[4837]: I1014 13:02:47.784376 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:47 crc kubenswrapper[4837]: I1014 13:02:47.784431 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:47 crc kubenswrapper[4837]: E1014 13:02:47.784561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:47 crc kubenswrapper[4837]: I1014 13:02:47.784636 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:47 crc kubenswrapper[4837]: E1014 13:02:47.784853 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:47 crc kubenswrapper[4837]: E1014 13:02:47.784983 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:48 crc kubenswrapper[4837]: I1014 13:02:48.784641 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:48 crc kubenswrapper[4837]: E1014 13:02:48.784895 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:49 crc kubenswrapper[4837]: I1014 13:02:49.784108 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:49 crc kubenswrapper[4837]: I1014 13:02:49.784226 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:49 crc kubenswrapper[4837]: I1014 13:02:49.784123 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:49 crc kubenswrapper[4837]: E1014 13:02:49.784366 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:49 crc kubenswrapper[4837]: E1014 13:02:49.784468 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:49 crc kubenswrapper[4837]: E1014 13:02:49.784593 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:50 crc kubenswrapper[4837]: I1014 13:02:50.786623 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:50 crc kubenswrapper[4837]: E1014 13:02:50.787016 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:51 crc kubenswrapper[4837]: I1014 13:02:51.784402 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:51 crc kubenswrapper[4837]: I1014 13:02:51.784478 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:51 crc kubenswrapper[4837]: I1014 13:02:51.784402 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:51 crc kubenswrapper[4837]: E1014 13:02:51.784574 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:51 crc kubenswrapper[4837]: E1014 13:02:51.784768 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:51 crc kubenswrapper[4837]: E1014 13:02:51.784965 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:52 crc kubenswrapper[4837]: I1014 13:02:52.784121 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:52 crc kubenswrapper[4837]: E1014 13:02:52.784775 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:53 crc kubenswrapper[4837]: I1014 13:02:53.784339 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:53 crc kubenswrapper[4837]: I1014 13:02:53.784437 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:53 crc kubenswrapper[4837]: I1014 13:02:53.784367 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:53 crc kubenswrapper[4837]: E1014 13:02:53.784532 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:53 crc kubenswrapper[4837]: E1014 13:02:53.784622 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:53 crc kubenswrapper[4837]: E1014 13:02:53.784776 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:54 crc kubenswrapper[4837]: I1014 13:02:54.784361 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:54 crc kubenswrapper[4837]: E1014 13:02:54.784648 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:55 crc kubenswrapper[4837]: I1014 13:02:55.784207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:55 crc kubenswrapper[4837]: I1014 13:02:55.784326 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:55 crc kubenswrapper[4837]: I1014 13:02:55.784393 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:55 crc kubenswrapper[4837]: E1014 13:02:55.784522 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:55 crc kubenswrapper[4837]: E1014 13:02:55.785040 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:55 crc kubenswrapper[4837]: E1014 13:02:55.785853 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:55 crc kubenswrapper[4837]: I1014 13:02:55.786479 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:02:55 crc kubenswrapper[4837]: E1014 13:02:55.786729 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xfw4j_openshift-ovn-kubernetes(f670a3c6-520c-45ba-980a-00c63703b02b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" Oct 14 13:02:56 crc kubenswrapper[4837]: I1014 13:02:56.784232 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:56 crc kubenswrapper[4837]: E1014 13:02:56.784459 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:57 crc kubenswrapper[4837]: I1014 13:02:57.784443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:57 crc kubenswrapper[4837]: I1014 13:02:57.784525 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:57 crc kubenswrapper[4837]: I1014 13:02:57.784451 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:57 crc kubenswrapper[4837]: E1014 13:02:57.784614 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:02:57 crc kubenswrapper[4837]: E1014 13:02:57.784745 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:57 crc kubenswrapper[4837]: E1014 13:02:57.784909 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.433062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/1.log" Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.433755 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/0.log" Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.433849 4837 generic.go:334] "Generic (PLEG): container finished" podID="01492025-d672-4746-af22-53fa41a3f612" containerID="01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5" exitCode=1 Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.433902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerDied","Data":"01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5"} Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.433953 4837 scope.go:117] "RemoveContainer" containerID="ccb1463a0f49860883c61a521361b0592c66a80093d850b27c6b60cc76daedcc" Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.434540 4837 scope.go:117] "RemoveContainer" containerID="01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5" Oct 14 13:02:58 crc kubenswrapper[4837]: E1014 13:02:58.434806 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-s6qr4_openshift-multus(01492025-d672-4746-af22-53fa41a3f612)\"" pod="openshift-multus/multus-s6qr4" podUID="01492025-d672-4746-af22-53fa41a3f612" Oct 14 13:02:58 crc kubenswrapper[4837]: I1014 13:02:58.783642 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:02:58 crc kubenswrapper[4837]: E1014 13:02:58.783874 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:02:59 crc kubenswrapper[4837]: I1014 13:02:59.441972 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/1.log" Oct 14 13:02:59 crc kubenswrapper[4837]: I1014 13:02:59.784406 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:02:59 crc kubenswrapper[4837]: I1014 13:02:59.784470 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:02:59 crc kubenswrapper[4837]: I1014 13:02:59.784475 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:02:59 crc kubenswrapper[4837]: E1014 13:02:59.784676 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:02:59 crc kubenswrapper[4837]: E1014 13:02:59.784843 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:02:59 crc kubenswrapper[4837]: E1014 13:02:59.785317 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:00 crc kubenswrapper[4837]: I1014 13:03:00.783762 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:00 crc kubenswrapper[4837]: E1014 13:03:00.784243 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:01 crc kubenswrapper[4837]: I1014 13:03:01.784042 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:01 crc kubenswrapper[4837]: I1014 13:03:01.784091 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:01 crc kubenswrapper[4837]: I1014 13:03:01.784267 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:01 crc kubenswrapper[4837]: E1014 13:03:01.784404 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:01 crc kubenswrapper[4837]: E1014 13:03:01.784584 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:01 crc kubenswrapper[4837]: E1014 13:03:01.784735 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:02 crc kubenswrapper[4837]: E1014 13:03:02.728756 4837 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 14 13:03:02 crc kubenswrapper[4837]: I1014 13:03:02.784057 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:02 crc kubenswrapper[4837]: E1014 13:03:02.786029 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:02 crc kubenswrapper[4837]: E1014 13:03:02.872639 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 14 13:03:03 crc kubenswrapper[4837]: I1014 13:03:03.784202 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:03 crc kubenswrapper[4837]: I1014 13:03:03.784256 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:03 crc kubenswrapper[4837]: I1014 13:03:03.784215 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:03 crc kubenswrapper[4837]: E1014 13:03:03.784317 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:03 crc kubenswrapper[4837]: E1014 13:03:03.784496 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:03 crc kubenswrapper[4837]: E1014 13:03:03.784637 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:04 crc kubenswrapper[4837]: I1014 13:03:04.783855 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:04 crc kubenswrapper[4837]: E1014 13:03:04.784111 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:05 crc kubenswrapper[4837]: I1014 13:03:05.783960 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:05 crc kubenswrapper[4837]: I1014 13:03:05.784039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:05 crc kubenswrapper[4837]: E1014 13:03:05.784209 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:05 crc kubenswrapper[4837]: I1014 13:03:05.784239 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:05 crc kubenswrapper[4837]: E1014 13:03:05.784361 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:05 crc kubenswrapper[4837]: E1014 13:03:05.784516 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:06 crc kubenswrapper[4837]: I1014 13:03:06.784190 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:06 crc kubenswrapper[4837]: E1014 13:03:06.784449 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:07 crc kubenswrapper[4837]: I1014 13:03:07.783885 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:07 crc kubenswrapper[4837]: I1014 13:03:07.783906 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:07 crc kubenswrapper[4837]: I1014 13:03:07.784219 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:07 crc kubenswrapper[4837]: E1014 13:03:07.784078 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:07 crc kubenswrapper[4837]: E1014 13:03:07.784359 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:07 crc kubenswrapper[4837]: E1014 13:03:07.784452 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:07 crc kubenswrapper[4837]: E1014 13:03:07.874054 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 14 13:03:08 crc kubenswrapper[4837]: I1014 13:03:08.797206 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:08 crc kubenswrapper[4837]: E1014 13:03:08.797428 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:09 crc kubenswrapper[4837]: I1014 13:03:09.784373 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:09 crc kubenswrapper[4837]: I1014 13:03:09.784401 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:09 crc kubenswrapper[4837]: E1014 13:03:09.784579 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:09 crc kubenswrapper[4837]: E1014 13:03:09.784732 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:09 crc kubenswrapper[4837]: I1014 13:03:09.785133 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:09 crc kubenswrapper[4837]: E1014 13:03:09.785504 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:10 crc kubenswrapper[4837]: I1014 13:03:10.784089 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:10 crc kubenswrapper[4837]: E1014 13:03:10.784764 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:10 crc kubenswrapper[4837]: I1014 13:03:10.785253 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.486990 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/3.log" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.490321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerStarted","Data":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.490694 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.532251 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podStartSLOduration=107.532237051 podStartE2EDuration="1m47.532237051s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:11.531944324 +0000 UTC m=+129.448944177" watchObservedRunningTime="2025-10-14 13:03:11.532237051 +0000 UTC m=+129.449236864" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.783607 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:11 crc kubenswrapper[4837]: E1014 13:03:11.783747 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.783852 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.783878 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:11 crc kubenswrapper[4837]: E1014 13:03:11.784085 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:11 crc kubenswrapper[4837]: E1014 13:03:11.784185 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.818192 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pcpcf"] Oct 14 13:03:11 crc kubenswrapper[4837]: I1014 13:03:11.818314 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:11 crc kubenswrapper[4837]: E1014 13:03:11.818411 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:12 crc kubenswrapper[4837]: E1014 13:03:12.874896 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 14 13:03:13 crc kubenswrapper[4837]: I1014 13:03:13.784481 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:13 crc kubenswrapper[4837]: I1014 13:03:13.784597 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:13 crc kubenswrapper[4837]: I1014 13:03:13.785098 4837 scope.go:117] "RemoveContainer" containerID="01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5" Oct 14 13:03:13 crc kubenswrapper[4837]: I1014 13:03:13.785226 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:13 crc kubenswrapper[4837]: E1014 13:03:13.785392 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:13 crc kubenswrapper[4837]: I1014 13:03:13.785510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:13 crc kubenswrapper[4837]: E1014 13:03:13.785621 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:13 crc kubenswrapper[4837]: E1014 13:03:13.785805 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:13 crc kubenswrapper[4837]: E1014 13:03:13.786142 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:14 crc kubenswrapper[4837]: I1014 13:03:14.504409 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/1.log" Oct 14 13:03:14 crc kubenswrapper[4837]: I1014 13:03:14.504753 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerStarted","Data":"9b9800c7caf14c369455bcd4d508943981bd965b9cfd5812889d0c36580034f5"} Oct 14 13:03:15 crc kubenswrapper[4837]: I1014 13:03:15.784220 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:15 crc kubenswrapper[4837]: E1014 13:03:15.784994 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:15 crc kubenswrapper[4837]: I1014 13:03:15.784288 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:15 crc kubenswrapper[4837]: E1014 13:03:15.785297 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:15 crc kubenswrapper[4837]: I1014 13:03:15.784245 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:15 crc kubenswrapper[4837]: E1014 13:03:15.785573 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:15 crc kubenswrapper[4837]: I1014 13:03:15.784322 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:15 crc kubenswrapper[4837]: E1014 13:03:15.785828 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:17 crc kubenswrapper[4837]: I1014 13:03:17.784482 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:17 crc kubenswrapper[4837]: I1014 13:03:17.784516 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:17 crc kubenswrapper[4837]: I1014 13:03:17.784629 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:17 crc kubenswrapper[4837]: E1014 13:03:17.784820 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pcpcf" podUID="7c934a24-9e12-46eb-851e-1a6925dc8909" Oct 14 13:03:17 crc kubenswrapper[4837]: I1014 13:03:17.785097 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:17 crc kubenswrapper[4837]: E1014 13:03:17.785242 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 14 13:03:17 crc kubenswrapper[4837]: E1014 13:03:17.785363 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 14 13:03:17 crc kubenswrapper[4837]: E1014 13:03:17.785538 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.784513 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.784797 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.784912 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.784951 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.788597 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.788864 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.789888 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.790618 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.790980 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 14 13:03:19 crc kubenswrapper[4837]: I1014 13:03:19.791953 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.226129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.284850 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k88lw"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.286690 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.298439 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.298637 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.298678 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.299928 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.304816 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.308119 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dfjx8"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.308987 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.309673 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.310011 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.310644 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.320570 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dfwxh"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.321397 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.322547 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.324127 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.327692 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.328721 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.333437 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-twbvc"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.333975 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nmf8f"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.334463 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.334604 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.335025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60cdb7dd-63cf-4f28-ab2f-b58de493e006-audit-dir\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.335378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-serving-cert\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.335583 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60cdb7dd-63cf-4f28-ab2f-b58de493e006-node-pullsecrets\") pod 
\"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.335726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-etcd-serving-ca\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.335874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-encryption-config\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.336067 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-image-import-ca\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.336478 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-audit\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.336652 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.336929 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-config\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.337108 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6q7t\" (UniqueName: \"kubernetes.io/projected/60cdb7dd-63cf-4f28-ab2f-b58de493e006-kube-api-access-s6q7t\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.338329 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-etcd-client\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.337525 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.338775 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8v59z"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.339570 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.340507 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.341245 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.343892 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.344778 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.345432 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.346218 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.353245 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.353802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.354193 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.354313 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-kjs2b"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.354663 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.354938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.370497 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.377581 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.378967 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.379570 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.379891 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.380055 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.380774 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.381472 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.381811 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.381906 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.381995 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.382110 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.382227 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.382430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.384051 
4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.384220 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqs75"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.385013 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.392898 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.395818 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6vd5d"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.401636 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.396460 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.395904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.401691 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7fw57"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.395976 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.396340 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.401838 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.401866 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.403317 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.425188 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.425497 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.443769 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444050 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-image-import-ca\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444092 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444116 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-serving-cert\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444135 4837 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444175 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444139 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-audit\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444358 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444390 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-dir\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.444462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444500 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12d04376-4d45-4906-9772-84f7c9d313bf-config\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444525 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41984359-fc99-4678-962a-b8c09f7c8e26-serving-cert\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444545 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444584 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhtf\" (UniqueName: 
\"kubernetes.io/projected/99a2942a-8cfe-42b7-a339-4d7b30ee12be-kube-api-access-cbhtf\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6q7t\" (UniqueName: \"kubernetes.io/projected/60cdb7dd-63cf-4f28-ab2f-b58de493e006-kube-api-access-s6q7t\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444656 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444827 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-audit\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444848 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.444660 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2942a-8cfe-42b7-a339-4d7b30ee12be-config\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445040 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50bef027-1010-4814-b1de-a758f875c57d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x9tvw\" (UID: \"50bef027-1010-4814-b1de-a758f875c57d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445078 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-audit-policies\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445112 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a284d0f7-a004-45c1-9eb6-a500afacf05b-serving-cert\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445141 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445193 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-client\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445220 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b67cdcc8-e9cd-4377-89b4-bef6191828b8-serving-cert\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-policies\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445291 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445342 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445298 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d789157-4dd2-4b8e-befc-84e8c03e6da6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.445386 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12d04376-4d45-4906-9772-84f7c9d313bf-trusted-ca\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-config\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-encryption-config\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445534 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/675483c3-eb80-41b4-b02b-db9059ec788b-config\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445555 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t272m\" (UniqueName: \"kubernetes.io/projected/8c4101d1-244d-4f5c-b059-54b9f26c225f-kube-api-access-t272m\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445568 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445575 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445671 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltn24\" (UniqueName: \"kubernetes.io/projected/675483c3-eb80-41b4-b02b-db9059ec788b-kube-api-access-ltn24\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.445725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-config\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-etcd-client\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445787 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c4101d1-244d-4f5c-b059-54b9f26c225f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445932 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-encryption-config\") pod \"apiserver-76f77b778f-k88lw\" (UID: 
\"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.445979 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mdt\" (UniqueName: \"kubernetes.io/projected/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-kube-api-access-k9mdt\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446004 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d270838b-a09d-4fe8-be26-3310e7989953-trusted-ca\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm5f5\" (UniqueName: \"kubernetes.io/projected/b67cdcc8-e9cd-4377-89b4-bef6191828b8-kube-api-access-qm5f5\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446046 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-config\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446071 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/675483c3-eb80-41b4-b02b-db9059ec788b-images\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446093 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74ed9556-5676-44b1-aa3c-02eb697ab0a8-audit-dir\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446112 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpgm8\" (UniqueName: \"kubernetes.io/projected/74ed9556-5676-44b1-aa3c-02eb697ab0a8-kube-api-access-dpgm8\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-ca\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446181 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 
crc kubenswrapper[4837]: I1014 13:03:21.446203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-serving-cert\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/675483c3-eb80-41b4-b02b-db9059ec788b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446260 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-client-ca\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446281 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/99a2942a-8cfe-42b7-a339-4d7b30ee12be-machine-approver-tls\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5bhv\" (UniqueName: 
\"kubernetes.io/projected/1ad70ef2-45cd-4139-a60e-0bda62597cb9-kube-api-access-b5bhv\") pod \"downloads-7954f5f757-kjs2b\" (UID: \"1ad70ef2-45cd-4139-a60e-0bda62597cb9\") " pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446334 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c4101d1-244d-4f5c-b059-54b9f26c225f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446372 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-config\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446395 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zh7h\" (UniqueName: \"kubernetes.io/projected/d270838b-a09d-4fe8-be26-3310e7989953-kube-api-access-7zh7h\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446414 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446434 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446458 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c42468-5fc7-4a67-86d7-73c0f7589899-serving-cert\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446478 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c4101d1-244d-4f5c-b059-54b9f26c225f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446501 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d4cs\" (UniqueName: 
\"kubernetes.io/projected/e6c42468-5fc7-4a67-86d7-73c0f7589899-kube-api-access-5d4cs\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njf4h\" (UniqueName: \"kubernetes.io/projected/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-kube-api-access-njf4h\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446569 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-etcd-client\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446591 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4kt\" (UniqueName: \"kubernetes.io/projected/50bef027-1010-4814-b1de-a758f875c57d-kube-api-access-zq4kt\") pod \"cluster-samples-operator-665b6dd947-x9tvw\" (UID: \"50bef027-1010-4814-b1de-a758f875c57d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446635 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a2942a-8cfe-42b7-a339-4d7b30ee12be-auth-proxy-config\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446652 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwvw\" (UniqueName: \"kubernetes.io/projected/64edb413-91a3-48ab-8d24-131c2d4fecb7-kube-api-access-jcwvw\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446672 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12d04376-4d45-4906-9772-84f7c9d313bf-serving-cert\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446694 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-875bw\" (UniqueName: \"kubernetes.io/projected/a284d0f7-a004-45c1-9eb6-a500afacf05b-kube-api-access-875bw\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446716 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446736 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-service-ca\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446771 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-service-ca-bundle\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60cdb7dd-63cf-4f28-ab2f-b58de493e006-audit-dir\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rtv8\" (UniqueName: \"kubernetes.io/projected/41984359-fc99-4678-962a-b8c09f7c8e26-kube-api-access-5rtv8\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446868 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-client-ca\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446893 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-serving-cert\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446911 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqdv\" (UniqueName: \"kubernetes.io/projected/3d789157-4dd2-4b8e-befc-84e8c03e6da6-kube-api-access-xjqdv\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d270838b-a09d-4fe8-be26-3310e7989953-metrics-tls\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.446981 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60cdb7dd-63cf-4f28-ab2f-b58de493e006-node-pullsecrets\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-etcd-serving-ca\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447023 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d270838b-a09d-4fe8-be26-3310e7989953-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.447069 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447089 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d789157-4dd2-4b8e-befc-84e8c03e6da6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447130 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447151 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bzg\" (UniqueName: \"kubernetes.io/projected/12d04376-4d45-4906-9772-84f7c9d313bf-kube-api-access-d7bzg\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-config\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.447460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/60cdb7dd-63cf-4f28-ab2f-b58de493e006-node-pullsecrets\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.448195 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-etcd-serving-ca\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.448794 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.448986 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.451429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-config\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.451616 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60cdb7dd-63cf-4f28-ab2f-b58de493e006-audit-dir\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.455191 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465603 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-encryption-config\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465727 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465870 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465882 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-etcd-client\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465988 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466068 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466089 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466110 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466141 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.465870 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466172 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466115 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466281 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466288 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466359 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466370 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466389 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466502 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466677 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466764 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466777 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466806 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466889 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466921 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.466944 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.467033 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.467558 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.467918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-image-import-ca\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.468008 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.480519 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.480685 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.480768 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481407 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481431 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481478 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481618 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481617 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481815 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.481913 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.481947 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482030 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482132 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482195 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bj55n"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482259 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482403 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482520 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482597 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482604 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482610 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482956 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482636 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482663 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.483088 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.482698 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.483228 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.483248 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.483291 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.483474 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.483899 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.484349 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.484586 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.491350 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-94j6s"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.492039 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.494758 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.495247 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60cdb7dd-63cf-4f28-ab2f-b58de493e006-serving-cert\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.495332 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.496227 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.496384 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.500807 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.503550 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dfjx8"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.505684 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.508677 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.511969 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.524861 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.525307 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.525480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60cdb7dd-63cf-4f28-ab2f-b58de493e006-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.526224 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dfwxh"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.527071 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.527581 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.527848 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.531383 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.531746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k88lw"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.532846 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.532876 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.533300 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.533578 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.533897 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.534269 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.535191 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.536603 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6p2hx"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.537274 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.538852 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.540295 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.541688 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.544870 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.544389 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.545908 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.546187 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.546349 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.546749 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.546938 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.547393 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.547617 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.547667 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.547809 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.548248 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.548715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6zbx7"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.548881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c42468-5fc7-4a67-86d7-73c0f7589899-serving-cert\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.548962 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c4101d1-244d-4f5c-b059-54b9f26c225f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.548983 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc 
kubenswrapper[4837]: I1014 13:03:21.549031 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d4cs\" (UniqueName: \"kubernetes.io/projected/e6c42468-5fc7-4a67-86d7-73c0f7589899-kube-api-access-5d4cs\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549049 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4kt\" (UniqueName: \"kubernetes.io/projected/50bef027-1010-4814-b1de-a758f875c57d-kube-api-access-zq4kt\") pod \"cluster-samples-operator-665b6dd947-x9tvw\" (UID: \"50bef027-1010-4814-b1de-a758f875c57d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549092 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njf4h\" (UniqueName: \"kubernetes.io/projected/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-kube-api-access-njf4h\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a2942a-8cfe-42b7-a339-4d7b30ee12be-auth-proxy-config\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcwvw\" (UniqueName: 
\"kubernetes.io/projected/64edb413-91a3-48ab-8d24-131c2d4fecb7-kube-api-access-jcwvw\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12d04376-4d45-4906-9772-84f7c9d313bf-serving-cert\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549270 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549287 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-service-ca\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 
13:03:21.549341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-service-ca-bundle\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549357 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-875bw\" (UniqueName: \"kubernetes.io/projected/a284d0f7-a004-45c1-9eb6-a500afacf05b-kube-api-access-875bw\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549403 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rtv8\" (UniqueName: \"kubernetes.io/projected/41984359-fc99-4678-962a-b8c09f7c8e26-kube-api-access-5rtv8\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549421 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-client-ca\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqdv\" (UniqueName: \"kubernetes.io/projected/3d789157-4dd2-4b8e-befc-84e8c03e6da6-kube-api-access-xjqdv\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549495 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d270838b-a09d-4fe8-be26-3310e7989953-metrics-tls\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549052 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549529 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d270838b-a09d-4fe8-be26-3310e7989953-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: 
\"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549582 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d789157-4dd2-4b8e-befc-84e8c03e6da6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549600 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549637 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549648 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549656 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bzg\" (UniqueName: \"kubernetes.io/projected/12d04376-4d45-4906-9772-84f7c9d313bf-kube-api-access-d7bzg\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549711 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-config\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-serving-cert\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-dir\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549819 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12d04376-4d45-4906-9772-84f7c9d313bf-config\") pod 
\"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549853 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41984359-fc99-4678-962a-b8c09f7c8e26-serving-cert\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549895 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhtf\" (UniqueName: \"kubernetes.io/projected/99a2942a-8cfe-42b7-a339-4d7b30ee12be-kube-api-access-cbhtf\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2942a-8cfe-42b7-a339-4d7b30ee12be-config\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549939 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/50bef027-1010-4814-b1de-a758f875c57d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x9tvw\" (UID: \"50bef027-1010-4814-b1de-a758f875c57d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549956 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-audit-policies\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549973 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a284d0f7-a004-45c1-9eb6-a500afacf05b-serving-cert\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549990 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-client\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 
13:03:21.550023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b67cdcc8-e9cd-4377-89b4-bef6191828b8-serving-cert\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550040 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-policies\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550054 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d789157-4dd2-4b8e-befc-84e8c03e6da6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550073 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12d04376-4d45-4906-9772-84f7c9d313bf-trusted-ca\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550089 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-config\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550106 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-encryption-config\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550125 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550144 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675483c3-eb80-41b4-b02b-db9059ec788b-config\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t272m\" (UniqueName: \"kubernetes.io/projected/8c4101d1-244d-4f5c-b059-54b9f26c225f-kube-api-access-t272m\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-config\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltn24\" (UniqueName: \"kubernetes.io/projected/675483c3-eb80-41b4-b02b-db9059ec788b-kube-api-access-ltn24\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-etcd-client\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550621 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c4101d1-244d-4f5c-b059-54b9f26c225f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 
13:03:21.550639 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mdt\" (UniqueName: \"kubernetes.io/projected/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-kube-api-access-k9mdt\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550654 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d270838b-a09d-4fe8-be26-3310e7989953-trusted-ca\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550678 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm5f5\" (UniqueName: \"kubernetes.io/projected/b67cdcc8-e9cd-4377-89b4-bef6191828b8-kube-api-access-qm5f5\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550698 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-config\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550713 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/675483c3-eb80-41b4-b02b-db9059ec788b-images\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74ed9556-5676-44b1-aa3c-02eb697ab0a8-audit-dir\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550746 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpgm8\" (UniqueName: \"kubernetes.io/projected/74ed9556-5676-44b1-aa3c-02eb697ab0a8-kube-api-access-dpgm8\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550765 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-serving-cert\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-ca\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/675483c3-eb80-41b4-b02b-db9059ec788b-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-client-ca\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/99a2942a-8cfe-42b7-a339-4d7b30ee12be-machine-approver-tls\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5bhv\" (UniqueName: \"kubernetes.io/projected/1ad70ef2-45cd-4139-a60e-0bda62597cb9-kube-api-access-b5bhv\") pod \"downloads-7954f5f757-kjs2b\" (UID: \"1ad70ef2-45cd-4139-a60e-0bda62597cb9\") " pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zh7h\" (UniqueName: \"kubernetes.io/projected/d270838b-a09d-4fe8-be26-3310e7989953-kube-api-access-7zh7h\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c4101d1-244d-4f5c-b059-54b9f26c225f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.550904 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.551364 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.551371 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a2942a-8cfe-42b7-a339-4d7b30ee12be-auth-proxy-config\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.551966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-service-ca-bundle\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.551988 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-config\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.549059 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.553013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.553651 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-client-ca\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.556169 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tl2qn"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.557931 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-sm47m"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.558971 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kjs2b"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.558993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4"] Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.560346 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.560686 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.561543 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a2942a-8cfe-42b7-a339-4d7b30ee12be-config\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.562393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c42468-5fc7-4a67-86d7-73c0f7589899-serving-cert\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.563909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-dir\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.564084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.565069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.565812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12d04376-4d45-4906-9772-84f7c9d313bf-serving-cert\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.566039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.566080 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-policies\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.567542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-config\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.570019 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d789157-4dd2-4b8e-befc-84e8c03e6da6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.570317 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.569456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-client-ca\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.572289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675483c3-eb80-41b4-b02b-db9059ec788b-config\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.573515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-audit-policies\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.573872 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12d04376-4d45-4906-9772-84f7c9d313bf-config\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.574132 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.574942 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.575483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.575648 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-config\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.575907 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c4101d1-244d-4f5c-b059-54b9f26c225f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.575963 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.575950 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b67cdcc8-e9cd-4377-89b4-bef6191828b8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.576097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12d04376-4d45-4906-9772-84f7c9d313bf-trusted-ca\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.576202 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.576987 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-etcd-client\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.577263 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74ed9556-5676-44b1-aa3c-02eb697ab0a8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.577979 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nmf8f"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.578392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d270838b-a09d-4fe8-be26-3310e7989953-trusted-ca\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.578452 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b67cdcc8-e9cd-4377-89b4-bef6191828b8-serving-cert\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.578572 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/99a2942a-8cfe-42b7-a339-4d7b30ee12be-machine-approver-tls\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.578610 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.578624 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74ed9556-5676-44b1-aa3c-02eb697ab0a8-audit-dir\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.578886 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a284d0f7-a004-45c1-9eb6-a500afacf05b-serving-cert\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.579118 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/675483c3-eb80-41b4-b02b-db9059ec788b-images\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.579300 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-ca\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.579326 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.580551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d270838b-a09d-4fe8-be26-3310e7989953-metrics-tls\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.581172 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.581615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d789157-4dd2-4b8e-befc-84e8c03e6da6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.581776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c4101d1-244d-4f5c-b059-54b9f26c225f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.583015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-serving-cert\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.583681 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50bef027-1010-4814-b1de-a758f875c57d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x9tvw\" (UID: \"50bef027-1010-4814-b1de-a758f875c57d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.583743 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7fw57"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.584302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.584555 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.584649 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-serving-cert\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.585343 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/675483c3-eb80-41b4-b02b-db9059ec788b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.585445 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6vd5d"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.586513 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.586963 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-twbvc"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.588574 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.588589 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.590014 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.591441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.591565 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.591785 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.592971 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.597512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/74ed9556-5676-44b1-aa3c-02eb697ab0a8-encryption-config\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.599050 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41984359-fc99-4678-962a-b8c09f7c8e26-serving-cert\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.600434 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8v59z"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.601656 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gdj98"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.602926 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdj98"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.603137 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.604329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.605631 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.606906 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6p2hx"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.607999 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.608232 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bj55n"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.609581 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.611070 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.611878 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.612990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.614410 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.615458 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.616578 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.617708 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.617741 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-client\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.618831 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.620150 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqs75"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.621553 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tl2qn"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.623054 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6zbx7"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.624291 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xtgwh"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.625375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.625765 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.626900 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.627934 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.628481 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm47m"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.630308 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.631580 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xtgwh"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.632559 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdj98"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.633499 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-59w7l"]
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.633878 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-59w7l"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.637315 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-etcd-service-ca\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.648705 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.667810 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.688186 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.692984 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41984359-fc99-4678-962a-b8c09f7c8e26-config\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.709169 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.728898 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.785524 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6q7t\" (UniqueName: \"kubernetes.io/projected/60cdb7dd-63cf-4f28-ab2f-b58de493e006-kube-api-access-s6q7t\") pod \"apiserver-76f77b778f-k88lw\" (UID: \"60cdb7dd-63cf-4f28-ab2f-b58de493e006\") " pod="openshift-apiserver/apiserver-76f77b778f-k88lw"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.789354 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.808622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.828885 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.848487 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.889288 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.909566 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.929807 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.936914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k88lw"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.949862 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.969542 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 14 13:03:21 crc kubenswrapper[4837]: I1014 13:03:21.990597 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.010595 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.031388 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.049802 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.070037 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.090211 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.110073 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.133014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.150522 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.169907 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.189715 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.210196 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.229311 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.249566 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.269387 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.289380 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.309142 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.329362 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.349454 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.369358 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.389913 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.409251 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.429449 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.450356 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.470505 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.480058 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k88lw"]
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.490434 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: W1014 13:03:22.496380 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60cdb7dd_63cf_4f28_ab2f_b58de493e006.slice/crio-6c5189e02a9949fdddc5b383b3f56c1098d7a3605e298ef1e6747192800993ed WatchSource:0}: Error finding container 6c5189e02a9949fdddc5b383b3f56c1098d7a3605e298ef1e6747192800993ed: Status 404 returned error can't find the container with id 6c5189e02a9949fdddc5b383b3f56c1098d7a3605e298ef1e6747192800993ed
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.509411 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.550512 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.551251 4837 request.go:700] Waited for 1.011354012s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.553660 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.555734 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" event={"ID":"60cdb7dd-63cf-4f28-ab2f-b58de493e006","Type":"ContainerStarted","Data":"6c5189e02a9949fdddc5b383b3f56c1098d7a3605e298ef1e6747192800993ed"}
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.569823 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.589124 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.609476 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.630045 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.648742 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.669567 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.690519 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.709669 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.729533 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.749226 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.769687 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.790371 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.809564 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.829750 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.850532 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.869335 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.919789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c4101d1-244d-4f5c-b059-54b9f26c225f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.938357 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcwvw\" (UniqueName: \"kubernetes.io/projected/64edb413-91a3-48ab-8d24-131c2d4fecb7-kube-api-access-jcwvw\") pod \"oauth-openshift-558db77b4-8v59z\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.957263 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d4cs\" (UniqueName: \"kubernetes.io/projected/e6c42468-5fc7-4a67-86d7-73c0f7589899-kube-api-access-5d4cs\") pod \"controller-manager-879f6c89f-dfwxh\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.977353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4kt\" (UniqueName: \"kubernetes.io/projected/50bef027-1010-4814-b1de-a758f875c57d-kube-api-access-zq4kt\") pod \"cluster-samples-operator-665b6dd947-x9tvw\" (UID: \"50bef027-1010-4814-b1de-a758f875c57d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw"
Oct 14 13:03:22 crc kubenswrapper[4837]: I1014 13:03:22.997393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njf4h\" (UniqueName: \"kubernetes.io/projected/2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca-kube-api-access-njf4h\") pod \"openshift-apiserver-operator-796bbdcf4f-652k8\" (UID: \"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.007397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bzg\" (UniqueName: \"kubernetes.io/projected/12d04376-4d45-4906-9772-84f7c9d313bf-kube-api-access-d7bzg\") pod \"console-operator-58897d9998-nmf8f\" (UID: \"12d04376-4d45-4906-9772-84f7c9d313bf\") " pod="openshift-console-operator/console-operator-58897d9998-nmf8f"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.034936 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rtv8\" (UniqueName: \"kubernetes.io/projected/41984359-fc99-4678-962a-b8c09f7c8e26-kube-api-access-5rtv8\") pod \"etcd-operator-b45778765-7fw57\" (UID: \"41984359-fc99-4678-962a-b8c09f7c8e26\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.049071 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.055573 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-875bw\" (UniqueName: \"kubernetes.io/projected/a284d0f7-a004-45c1-9eb6-a500afacf05b-kube-api-access-875bw\") pod \"route-controller-manager-6576b87f9c-vnpkw\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.069621 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.076592 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhtf\" (UniqueName: \"kubernetes.io/projected/99a2942a-8cfe-42b7-a339-4d7b30ee12be-kube-api-access-cbhtf\") pod \"machine-approver-56656f9798-6r2j7\" (UID: \"99a2942a-8cfe-42b7-a339-4d7b30ee12be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.089805 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.106184 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.109833 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.130080 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.132304 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.171896 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.173553 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.189045 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.194339 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.195716 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh"
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.207810 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.209904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.218890 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:23 crc kubenswrapper[4837]: W1014 13:03:23.224493 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a2942a_8cfe_42b7_a339_4d7b30ee12be.slice/crio-22f47e7c4c57aae6bada05d4579d446297cb623628e4f6ed32dd5906cbb00be2 WatchSource:0}: Error finding container 22f47e7c4c57aae6bada05d4579d446297cb623628e4f6ed32dd5906cbb00be2: Status 404 returned error can't find the container with id 22f47e7c4c57aae6bada05d4579d446297cb623628e4f6ed32dd5906cbb00be2 Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.229637 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.256773 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.285810 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t272m\" (UniqueName: \"kubernetes.io/projected/8c4101d1-244d-4f5c-b059-54b9f26c225f-kube-api-access-t272m\") pod \"cluster-image-registry-operator-dc59b4c8b-c5zxj\" (UID: \"8c4101d1-244d-4f5c-b059-54b9f26c225f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.297612 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.298421 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltn24\" (UniqueName: \"kubernetes.io/projected/675483c3-eb80-41b4-b02b-db9059ec788b-kube-api-access-ltn24\") pod \"machine-api-operator-5694c8668f-dfjx8\" (UID: \"675483c3-eb80-41b4-b02b-db9059ec788b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.302265 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.308878 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.329748 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.347848 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.349626 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.369027 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: W1014 13:03:23.399320 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f32f8e3_1b2f_4bf9_93e6_c7c649a97dca.slice/crio-9a5a796f6b48121ad45fc6844f981692d7b5853f192858ae20f85f3f9e55d2fb WatchSource:0}: Error finding container 9a5a796f6b48121ad45fc6844f981692d7b5853f192858ae20f85f3f9e55d2fb: Status 404 returned error can't find the container with id 9a5a796f6b48121ad45fc6844f981692d7b5853f192858ae20f85f3f9e55d2fb Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.404884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqdv\" (UniqueName: \"kubernetes.io/projected/3d789157-4dd2-4b8e-befc-84e8c03e6da6-kube-api-access-xjqdv\") pod \"openshift-controller-manager-operator-756b6f6bc6-dcmj4\" (UID: \"3d789157-4dd2-4b8e-befc-84e8c03e6da6\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.426502 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mdt\" (UniqueName: \"kubernetes.io/projected/9653fbf6-7b49-40eb-b8af-1c89f9ed3e88-kube-api-access-k9mdt\") pod \"openshift-config-operator-7777fb866f-tvv2p\" (UID: \"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.450191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5bhv\" (UniqueName: \"kubernetes.io/projected/1ad70ef2-45cd-4139-a60e-0bda62597cb9-kube-api-access-b5bhv\") pod \"downloads-7954f5f757-kjs2b\" (UID: \"1ad70ef2-45cd-4139-a60e-0bda62597cb9\") " pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.455259 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.467626 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zh7h\" (UniqueName: \"kubernetes.io/projected/d270838b-a09d-4fe8-be26-3310e7989953-kube-api-access-7zh7h\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.484840 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm5f5\" (UniqueName: \"kubernetes.io/projected/b67cdcc8-e9cd-4377-89b4-bef6191828b8-kube-api-access-qm5f5\") pod \"authentication-operator-69f744f599-twbvc\" (UID: \"b67cdcc8-e9cd-4377-89b4-bef6191828b8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.512747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpgm8\" (UniqueName: \"kubernetes.io/projected/74ed9556-5676-44b1-aa3c-02eb697ab0a8-kube-api-access-dpgm8\") pod \"apiserver-7bbb656c7d-fq9rr\" (UID: \"74ed9556-5676-44b1-aa3c-02eb697ab0a8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.521765 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d270838b-a09d-4fe8-be26-3310e7989953-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n6d46\" (UID: \"d270838b-a09d-4fe8-be26-3310e7989953\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.534386 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 14 13:03:23 crc kubenswrapper[4837]: 
I1014 13:03:23.536424 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nmf8f"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.543069 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.548697 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 14 13:03:23 crc kubenswrapper[4837]: W1014 13:03:23.562198 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12d04376_4d45_4906_9772_84f7c9d313bf.slice/crio-4443fb178000b8a78ef5eb29ffedfb81d6ba49aecad7fda8d046af7496738389 WatchSource:0}: Error finding container 4443fb178000b8a78ef5eb29ffedfb81d6ba49aecad7fda8d046af7496738389: Status 404 returned error can't find the container with id 4443fb178000b8a78ef5eb29ffedfb81d6ba49aecad7fda8d046af7496738389 Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.564590 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" event={"ID":"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca","Type":"ContainerStarted","Data":"9a5a796f6b48121ad45fc6844f981692d7b5853f192858ae20f85f3f9e55d2fb"} Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.567633 4837 request.go:700] Waited for 1.964510813s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.568721 4837 generic.go:334] "Generic (PLEG): container finished" podID="60cdb7dd-63cf-4f28-ab2f-b58de493e006" 
containerID="05940ce7d3bed564818f5604363317cedfa7145aede70082ac33575bd4be6915" exitCode=0 Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.568809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" event={"ID":"60cdb7dd-63cf-4f28-ab2f-b58de493e006","Type":"ContainerDied","Data":"05940ce7d3bed564818f5604363317cedfa7145aede70082ac33575bd4be6915"} Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.569039 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.572282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" event={"ID":"99a2942a-8cfe-42b7-a339-4d7b30ee12be","Type":"ContainerStarted","Data":"22f47e7c4c57aae6bada05d4579d446297cb623628e4f6ed32dd5906cbb00be2"} Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.592096 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.607279 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.609032 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.616092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.617081 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.628891 4837 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.629167 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.635908 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.648907 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.669017 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.678458 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8v59z"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.679645 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7fw57"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.684323 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dfwxh"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.689350 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.690389 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.710739 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.725296 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778683 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smdnn\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-kube-api-access-smdnn\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-bound-sa-token\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778769 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-oauth-config\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778801 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-console-config\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778827 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-oauth-serving-cert\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-trusted-ca-bundle\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778877 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14165edd-b69a-4886-8405-09298571b47b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.778895 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-registry-certificates\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: E1014 13:03:23.779044 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.279030199 +0000 UTC m=+142.196030012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.782900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-registry-tls\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.783133 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-serving-cert\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " 
pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.783178 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-service-ca\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.783225 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-trusted-ca\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.783246 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14165edd-b69a-4886-8405-09298571b47b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.783265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz9rl\" (UniqueName: \"kubernetes.io/projected/fb47e83f-903a-4420-9741-645bbbdf63c4-kube-api-access-gz9rl\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.808303 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-twbvc"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.814609 
4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:23 crc kubenswrapper[4837]: W1014 13:03:23.866417 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67cdcc8_e9cd_4377_89b4_bef6191828b8.slice/crio-ea708925472ff5df4b27e2666c604b95b3a508aa281befc559e6c8eebf075767 WatchSource:0}: Error finding container ea708925472ff5df4b27e2666c604b95b3a508aa281befc559e6c8eebf075767: Status 404 returned error can't find the container with id ea708925472ff5df4b27e2666c604b95b3a508aa281befc559e6c8eebf075767 Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884137 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-oauth-config\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884368 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-plugins-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884391 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-console-config\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884508 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/13b89429-bd07-413e-9436-fe6f28d882ff-srv-cert\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884528 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/708198b4-2426-4e20-8731-0cdbf6083496-tmpfs\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sfxp\" (UniqueName: \"kubernetes.io/projected/2e342172-b3e8-4e3b-b4a5-0d050095e20a-kube-api-access-6sfxp\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884558 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-signing-cabundle\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884572 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfj8\" (UniqueName: \"kubernetes.io/projected/13b89429-bd07-413e-9436-fe6f28d882ff-kube-api-access-5kfj8\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-oauth-serving-cert\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884635 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kpvk\" (UniqueName: \"kubernetes.io/projected/05a8461c-66c0-46a5-82cd-c0f075fd5842-kube-api-access-4kpvk\") pod \"dns-operator-744455d44c-bj55n\" (UID: \"05a8461c-66c0-46a5-82cd-c0f075fd5842\") " pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884651 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884708 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-certs\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884724 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcxhq\" (UniqueName: \"kubernetes.io/projected/341ec536-bf38-4225-a0bc-da7f4837cdbc-kube-api-access-dcxhq\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/13b89429-bd07-413e-9436-fe6f28d882ff-profile-collector-cert\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-node-bootstrap-token\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " 
pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884799 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/708198b4-2426-4e20-8731-0cdbf6083496-apiservice-cert\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-registry-tls\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884875 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a3a1d7-2659-42f9-92c3-2086d0ab27f7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p4hjj\" (UID: \"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884895 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spkrj\" (UniqueName: \"kubernetes.io/projected/d18a3baf-1b3a-4822-aa24-47bb7cda4725-kube-api-access-spkrj\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884914 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-stats-auth\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14165edd-b69a-4886-8405-09298571b47b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgk5\" (UniqueName: \"kubernetes.io/projected/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-kube-api-access-fdgk5\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.884964 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-config\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz9rl\" (UniqueName: \"kubernetes.io/projected/fb47e83f-903a-4420-9741-645bbbdf63c4-kube-api-access-gz9rl\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc 
kubenswrapper[4837]: I1014 13:03:23.885034 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1577b547-7e30-4b8e-9959-fdd88088041c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wzp95\" (UID: \"1577b547-7e30-4b8e-9959-fdd88088041c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885055 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aeb6baa7-a962-4526-bb66-5907ac7c0141-profile-collector-cert\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-images\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885092 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7f932ac-bd67-48e8-9f8d-b90218acaeda-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885106 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81ca45e-66f6-49c7-9963-b75b6d87c91f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885130 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-socket-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885145 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-registration-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885173 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885189 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-default-certificate\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " 
pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885207 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-secret-volume\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885256 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-bound-sa-token\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: E1014 13:03:23.885280 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.385259786 +0000 UTC m=+142.302259599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885359 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzgh\" (UniqueName: \"kubernetes.io/projected/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-kube-api-access-2tzgh\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-mountpoint-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885414 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/708198b4-2426-4e20-8731-0cdbf6083496-webhook-cert\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885430 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/341ec536-bf38-4225-a0bc-da7f4837cdbc-service-ca-bundle\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885466 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/926381ee-78f6-4811-9695-0ebf216b3d8b-cert\") pod \"ingress-canary-gdj98\" (UID: \"926381ee-78f6-4811-9695-0ebf216b3d8b\") " pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885524 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv4v2\" (UniqueName: \"kubernetes.io/projected/048a43ae-98e0-489b-9ef4-c63f44881fa0-kube-api-access-nv4v2\") pod \"migrator-59844c95c7-g98dl\" (UID: \"048a43ae-98e0-489b-9ef4-c63f44881fa0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-serving-cert\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885654 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885673 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f932ac-bd67-48e8-9f8d-b90218acaeda-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885690 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-proxy-tls\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885707 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18a3baf-1b3a-4822-aa24-47bb7cda4725-config-volume\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-trusted-ca-bundle\") pod \"console-f9d7485db-6vd5d\" (UID: 
\"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885764 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885803 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkk8\" (UniqueName: \"kubernetes.io/projected/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-kube-api-access-5bkk8\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885838 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14165edd-b69a-4886-8405-09298571b47b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-signing-key\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885912 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvs8x\" (UniqueName: \"kubernetes.io/projected/a8a3a1d7-2659-42f9-92c3-2086d0ab27f7-kube-api-access-kvs8x\") pod \"package-server-manager-789f6589d5-p4hjj\" (UID: \"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885931 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d18a3baf-1b3a-4822-aa24-47bb7cda4725-metrics-tls\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885945 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-metrics-certs\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.885980 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-registry-certificates\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.886016 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfb34230-e125-4ed5-86dc-f6bc57bb7f51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6p2hx\" (UID: \"bfb34230-e125-4ed5-86dc-f6bc57bb7f51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.886048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l484d\" (UniqueName: 
\"kubernetes.io/projected/bfb34230-e125-4ed5-86dc-f6bc57bb7f51-kube-api-access-l484d\") pod \"multus-admission-controller-857f4d67dd-6p2hx\" (UID: \"bfb34230-e125-4ed5-86dc-f6bc57bb7f51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.886083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-config\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.886127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b81ca45e-66f6-49c7-9963-b75b6d87c91f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.888420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-trusted-ca-bundle\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.888942 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-oauth-serving-cert\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc 
kubenswrapper[4837]: I1014 13:03:23.889910 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-registry-certificates\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.889965 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-console-config\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.890588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14165edd-b69a-4886-8405-09298571b47b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.892810 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-registry-tls\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.894599 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-oauth-config\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc 
kubenswrapper[4837]: I1014 13:03:23.896850 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dfjx8"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.897631 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aeb6baa7-a962-4526-bb66-5907ac7c0141-srv-cert\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.897670 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e342172-b3e8-4e3b-b4a5-0d050095e20a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.897739 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv8dt\" (UniqueName: \"kubernetes.io/projected/1577b547-7e30-4b8e-9959-fdd88088041c-kube-api-access-wv8dt\") pod \"control-plane-machine-set-operator-78cbb6b69f-wzp95\" (UID: \"1577b547-7e30-4b8e-9959-fdd88088041c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.897805 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kghm2\" (UniqueName: \"kubernetes.io/projected/708198b4-2426-4e20-8731-0cdbf6083496-kube-api-access-kghm2\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 
13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.897821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898503 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-serving-cert\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14165edd-b69a-4886-8405-09298571b47b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898525 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcxc\" (UniqueName: \"kubernetes.io/projected/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-kube-api-access-jrcxc\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-trusted-ca\") pod 
\"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-service-ca\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898919 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sllmg\" (UniqueName: \"kubernetes.io/projected/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-kube-api-access-sllmg\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898941 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-csi-data-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.898958 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05a8461c-66c0-46a5-82cd-c0f075fd5842-metrics-tls\") pod \"dns-operator-744455d44c-bj55n\" (UID: \"05a8461c-66c0-46a5-82cd-c0f075fd5842\") " pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899051 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsxv\" (UniqueName: 
\"kubernetes.io/projected/aeb6baa7-a962-4526-bb66-5907ac7c0141-kube-api-access-4bsxv\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899070 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899186 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f932ac-bd67-48e8-9f8d-b90218acaeda-config\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899206 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81ca45e-66f6-49c7-9963-b75b6d87c91f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899222 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e342172-b3e8-4e3b-b4a5-0d050095e20a-proxy-tls\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: 
\"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4rf\" (UniqueName: \"kubernetes.io/projected/0e168cef-fe99-471f-89db-34290cbb6639-kube-api-access-ft4rf\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899359 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-config-volume\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smdnn\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-kube-api-access-smdnn\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899439 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgm8k\" (UniqueName: \"kubernetes.io/projected/926381ee-78f6-4811-9695-0ebf216b3d8b-kube-api-access-hgm8k\") pod \"ingress-canary-gdj98\" (UID: \"926381ee-78f6-4811-9695-0ebf216b3d8b\") " pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899456 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65678\" (UniqueName: \"kubernetes.io/projected/3b22ce4d-5f14-40e9-943f-8368b104a4b9-kube-api-access-65678\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.899555 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2h2h\" (UniqueName: \"kubernetes.io/projected/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-kube-api-access-b2h2h\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.900183 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-service-ca\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.903640 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-trusted-ca\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.904312 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-serving-cert\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" 
Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.924264 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-bound-sa-token\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:23 crc kubenswrapper[4837]: W1014 13:03:23.927508 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9653fbf6_7b49_40eb_b8af_1c89f9ed3e88.slice/crio-8f9ab450af9ee2cb65f337493c1391d5722d133e6becf5a3085879079571439b WatchSource:0}: Error finding container 8f9ab450af9ee2cb65f337493c1391d5722d133e6becf5a3085879079571439b: Status 404 returned error can't find the container with id 8f9ab450af9ee2cb65f337493c1391d5722d133e6becf5a3085879079571439b Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.940386 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.943656 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz9rl\" (UniqueName: \"kubernetes.io/projected/fb47e83f-903a-4420-9741-645bbbdf63c4-kube-api-access-gz9rl\") pod \"console-f9d7485db-6vd5d\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:23 crc kubenswrapper[4837]: W1014 13:03:23.976487 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d789157_4dd2_4b8e_befc_84e8c03e6da6.slice/crio-ac0c86a0f316dbe5aca86b8ba87c0881cfe006a0c53e394fbeeff54aeac08d9d WatchSource:0}: Error finding container ac0c86a0f316dbe5aca86b8ba87c0881cfe006a0c53e394fbeeff54aeac08d9d: Status 404 returned error can't 
find the container with id ac0c86a0f316dbe5aca86b8ba87c0881cfe006a0c53e394fbeeff54aeac08d9d Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.983082 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kjs2b"] Oct 14 13:03:23 crc kubenswrapper[4837]: I1014 13:03:23.986648 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smdnn\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-kube-api-access-smdnn\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sllmg\" (UniqueName: \"kubernetes.io/projected/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-kube-api-access-sllmg\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002625 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-csi-data-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05a8461c-66c0-46a5-82cd-c0f075fd5842-metrics-tls\") pod \"dns-operator-744455d44c-bj55n\" (UID: \"05a8461c-66c0-46a5-82cd-c0f075fd5842\") " pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002700 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4bsxv\" (UniqueName: \"kubernetes.io/projected/aeb6baa7-a962-4526-bb66-5907ac7c0141-kube-api-access-4bsxv\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f932ac-bd67-48e8-9f8d-b90218acaeda-config\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81ca45e-66f6-49c7-9963-b75b6d87c91f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002832 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e342172-b3e8-4e3b-b4a5-0d050095e20a-proxy-tls\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4rf\" (UniqueName: \"kubernetes.io/projected/0e168cef-fe99-471f-89db-34290cbb6639-kube-api-access-ft4rf\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-config-volume\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002928 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgm8k\" (UniqueName: \"kubernetes.io/projected/926381ee-78f6-4811-9695-0ebf216b3d8b-kube-api-access-hgm8k\") pod \"ingress-canary-gdj98\" (UID: \"926381ee-78f6-4811-9695-0ebf216b3d8b\") " pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.002948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65678\" (UniqueName: \"kubernetes.io/projected/3b22ce4d-5f14-40e9-943f-8368b104a4b9-kube-api-access-65678\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003004 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2h2h\" (UniqueName: 
\"kubernetes.io/projected/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-kube-api-access-b2h2h\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003071 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-plugins-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/13b89429-bd07-413e-9436-fe6f28d882ff-srv-cert\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/708198b4-2426-4e20-8731-0cdbf6083496-tmpfs\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc 
kubenswrapper[4837]: I1014 13:03:24.003191 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sfxp\" (UniqueName: \"kubernetes.io/projected/2e342172-b3e8-4e3b-b4a5-0d050095e20a-kube-api-access-6sfxp\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-signing-cabundle\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003276 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfj8\" (UniqueName: \"kubernetes.io/projected/13b89429-bd07-413e-9436-fe6f28d882ff-kube-api-access-5kfj8\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kpvk\" (UniqueName: \"kubernetes.io/projected/05a8461c-66c0-46a5-82cd-c0f075fd5842-kube-api-access-4kpvk\") pod \"dns-operator-744455d44c-bj55n\" (UID: \"05a8461c-66c0-46a5-82cd-c0f075fd5842\") " pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003358 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-certs\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcxhq\" (UniqueName: \"kubernetes.io/projected/341ec536-bf38-4225-a0bc-da7f4837cdbc-kube-api-access-dcxhq\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/13b89429-bd07-413e-9436-fe6f28d882ff-profile-collector-cert\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003580 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-node-bootstrap-token\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003607 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/708198b4-2426-4e20-8731-0cdbf6083496-apiservice-cert\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a3a1d7-2659-42f9-92c3-2086d0ab27f7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p4hjj\" (UID: \"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003695 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkrj\" (UniqueName: \"kubernetes.io/projected/d18a3baf-1b3a-4822-aa24-47bb7cda4725-kube-api-access-spkrj\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003744 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-stats-auth\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" 
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.005486 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-config-volume\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006056 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgk5\" (UniqueName: \"kubernetes.io/projected/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-kube-api-access-fdgk5\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7f932ac-bd67-48e8-9f8d-b90218acaeda-config\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006090 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-config\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006215 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1577b547-7e30-4b8e-9959-fdd88088041c-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-wzp95\" (UID: \"1577b547-7e30-4b8e-9959-fdd88088041c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006492 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aeb6baa7-a962-4526-bb66-5907ac7c0141-profile-collector-cert\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-images\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7f932ac-bd67-48e8-9f8d-b90218acaeda-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81ca45e-66f6-49c7-9963-b75b6d87c91f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.006893 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-plugins-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.003449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-csi-data-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.007014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-socket-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.007218 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.507199786 +0000 UTC m=+142.424199609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.007298 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-registration-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.007533 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.007574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-default-certificate\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.007921 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05a8461c-66c0-46a5-82cd-c0f075fd5842-metrics-tls\") pod \"dns-operator-744455d44c-bj55n\" (UID: 
\"05a8461c-66c0-46a5-82cd-c0f075fd5842\") " pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008083 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-secret-volume\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzgh\" (UniqueName: \"kubernetes.io/projected/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-kube-api-access-2tzgh\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008191 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-mountpoint-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008213 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/708198b4-2426-4e20-8731-0cdbf6083496-webhook-cert\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/341ec536-bf38-4225-a0bc-da7f4837cdbc-service-ca-bundle\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008472 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/926381ee-78f6-4811-9695-0ebf216b3d8b-cert\") pod \"ingress-canary-gdj98\" (UID: \"926381ee-78f6-4811-9695-0ebf216b3d8b\") " pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv4v2\" (UniqueName: \"kubernetes.io/projected/048a43ae-98e0-489b-9ef4-c63f44881fa0-kube-api-access-nv4v2\") pod \"migrator-59844c95c7-g98dl\" (UID: \"048a43ae-98e0-489b-9ef4-c63f44881fa0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-serving-cert\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.008966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f932ac-bd67-48e8-9f8d-b90218acaeda-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009106 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-proxy-tls\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009137 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18a3baf-1b3a-4822-aa24-47bb7cda4725-config-volume\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkk8\" (UniqueName: \"kubernetes.io/projected/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-kube-api-access-5bkk8\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:24 crc 
kubenswrapper[4837]: I1014 13:03:24.009504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-signing-key\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009645 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvs8x\" (UniqueName: \"kubernetes.io/projected/a8a3a1d7-2659-42f9-92c3-2086d0ab27f7-kube-api-access-kvs8x\") pod \"package-server-manager-789f6589d5-p4hjj\" (UID: \"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009813 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d18a3baf-1b3a-4822-aa24-47bb7cda4725-metrics-tls\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009827 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/341ec536-bf38-4225-a0bc-da7f4837cdbc-service-ca-bundle\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009845 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-metrics-certs\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc 
kubenswrapper[4837]: I1014 13:03:24.009908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81ca45e-66f6-49c7-9963-b75b6d87c91f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.010029 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfb34230-e125-4ed5-86dc-f6bc57bb7f51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6p2hx\" (UID: \"bfb34230-e125-4ed5-86dc-f6bc57bb7f51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.010233 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l484d\" (UniqueName: \"kubernetes.io/projected/bfb34230-e125-4ed5-86dc-f6bc57bb7f51-kube-api-access-l484d\") pod \"multus-admission-controller-857f4d67dd-6p2hx\" (UID: \"bfb34230-e125-4ed5-86dc-f6bc57bb7f51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.010731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-config\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.010765 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b81ca45e-66f6-49c7-9963-b75b6d87c91f-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.011026 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aeb6baa7-a962-4526-bb66-5907ac7c0141-srv-cert\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.011057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e342172-b3e8-4e3b-b4a5-0d050095e20a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.011355 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv8dt\" (UniqueName: \"kubernetes.io/projected/1577b547-7e30-4b8e-9959-fdd88088041c-kube-api-access-wv8dt\") pod \"control-plane-machine-set-operator-78cbb6b69f-wzp95\" (UID: \"1577b547-7e30-4b8e-9959-fdd88088041c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.011383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kghm2\" (UniqueName: \"kubernetes.io/projected/708198b4-2426-4e20-8731-0cdbf6083496-kube-api-access-kghm2\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 
13:03:24.011446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.011522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.009140 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.012035 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcxc\" (UniqueName: \"kubernetes.io/projected/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-kube-api-access-jrcxc\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.012049 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aeb6baa7-a962-4526-bb66-5907ac7c0141-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.014027 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-config\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.014341 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-mountpoint-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.015068 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-signing-key\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.015115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-socket-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.015295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a8a3a1d7-2659-42f9-92c3-2086d0ab27f7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p4hjj\" (UID: \"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.018006 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/708198b4-2426-4e20-8731-0cdbf6083496-tmpfs\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.020081 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.020126 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-images\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.020559 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-config\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 
13:03:24.021794 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.023581 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.027035 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-signing-cabundle\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.028082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b22ce4d-5f14-40e9-943f-8368b104a4b9-registration-dir\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.032226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e342172-b3e8-4e3b-b4a5-0d050095e20a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.039588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-default-certificate\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.039623 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/13b89429-bd07-413e-9436-fe6f28d882ff-srv-cert\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.040118 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/13b89429-bd07-413e-9436-fe6f28d882ff-profile-collector-cert\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.040235 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-stats-auth\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.040617 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-certs\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.040629 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/d18a3baf-1b3a-4822-aa24-47bb7cda4725-config-volume\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.040843 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.041437 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d18a3baf-1b3a-4822-aa24-47bb7cda4725-metrics-tls\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.041606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/341ec536-bf38-4225-a0bc-da7f4837cdbc-metrics-certs\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.041851 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aeb6baa7-a962-4526-bb66-5907ac7c0141-srv-cert\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.041851 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-proxy-tls\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.042219 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfb34230-e125-4ed5-86dc-f6bc57bb7f51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6p2hx\" (UID: \"bfb34230-e125-4ed5-86dc-f6bc57bb7f51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.043602 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.048124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1577b547-7e30-4b8e-9959-fdd88088041c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wzp95\" (UID: \"1577b547-7e30-4b8e-9959-fdd88088041c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.048236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81ca45e-66f6-49c7-9963-b75b6d87c91f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.057699 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e342172-b3e8-4e3b-b4a5-0d050095e20a-proxy-tls\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.059852 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f932ac-bd67-48e8-9f8d-b90218acaeda-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.066498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/708198b4-2426-4e20-8731-0cdbf6083496-webhook-cert\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.068149 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-node-bootstrap-token\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.081799 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/708198b4-2426-4e20-8731-0cdbf6083496-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.083533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4rf\" (UniqueName: \"kubernetes.io/projected/0e168cef-fe99-471f-89db-34290cbb6639-kube-api-access-ft4rf\") pod \"marketplace-operator-79b997595-sm47m\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.083628 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-secret-volume\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.084413 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/926381ee-78f6-4811-9695-0ebf216b3d8b-cert\") pod \"ingress-canary-gdj98\" (UID: \"926381ee-78f6-4811-9695-0ebf216b3d8b\") " pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.086582 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-serving-cert\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.098400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sllmg\" (UniqueName: 
\"kubernetes.io/projected/ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe-kube-api-access-sllmg\") pod \"service-ca-9c57cc56f-6zbx7\" (UID: \"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe\") " pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.104862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfj8\" (UniqueName: \"kubernetes.io/projected/13b89429-bd07-413e-9436-fe6f28d882ff-kube-api-access-5kfj8\") pod \"catalog-operator-68c6474976-tkgxv\" (UID: \"13b89429-bd07-413e-9436-fe6f28d882ff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.104961 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgm8k\" (UniqueName: \"kubernetes.io/projected/926381ee-78f6-4811-9695-0ebf216b3d8b-kube-api-access-hgm8k\") pod \"ingress-canary-gdj98\" (UID: \"926381ee-78f6-4811-9695-0ebf216b3d8b\") " pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.113559 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.113721 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.613697739 +0000 UTC m=+142.530697552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.113833 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.114118 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.61411027 +0000 UTC m=+142.531110083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.132013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65678\" (UniqueName: \"kubernetes.io/projected/3b22ce4d-5f14-40e9-943f-8368b104a4b9-kube-api-access-65678\") pod \"csi-hostpathplugin-xtgwh\" (UID: \"3b22ce4d-5f14-40e9-943f-8368b104a4b9\") " pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.142616 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2h2h\" (UniqueName: \"kubernetes.io/projected/7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6-kube-api-access-b2h2h\") pod \"kube-storage-version-migrator-operator-b67b599dd-rtlpl\" (UID: \"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.152975 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.159581 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.163046 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcxhq\" (UniqueName: \"kubernetes.io/projected/341ec536-bf38-4225-a0bc-da7f4837cdbc-kube-api-access-dcxhq\") pod \"router-default-5444994796-94j6s\" (UID: \"341ec536-bf38-4225-a0bc-da7f4837cdbc\") " pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.177873 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.181053 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46"] Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.183451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kpvk\" (UniqueName: \"kubernetes.io/projected/05a8461c-66c0-46a5-82cd-c0f075fd5842-kube-api-access-4kpvk\") pod \"dns-operator-744455d44c-bj55n\" (UID: \"05a8461c-66c0-46a5-82cd-c0f075fd5842\") " pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.186361 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdj98" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.207605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c00ce0-01af-4442-ad19-cc2ab6ff94dc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-24l9g\" (UID: \"17c00ce0-01af-4442-ad19-cc2ab6ff94dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.212375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.217366 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.217507 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.71748585 +0000 UTC m=+142.634485663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.217719 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.218048 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.718035666 +0000 UTC m=+142.635035469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.227534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcxc\" (UniqueName: \"kubernetes.io/projected/4d1b67fb-138a-4071-a0ce-c646e0ed7e7e-kube-api-access-jrcxc\") pod \"machine-config-server-59w7l\" (UID: \"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e\") " pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.246806 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsxv\" (UniqueName: \"kubernetes.io/projected/aeb6baa7-a962-4526-bb66-5907ac7c0141-kube-api-access-4bsxv\") pod \"olm-operator-6b444d44fb-td9k8\" (UID: \"aeb6baa7-a962-4526-bb66-5907ac7c0141\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.270268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7f932ac-bd67-48e8-9f8d-b90218acaeda-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mfkfl\" (UID: \"b7f932ac-bd67-48e8-9f8d-b90218acaeda\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.288942 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvs8x\" (UniqueName: \"kubernetes.io/projected/a8a3a1d7-2659-42f9-92c3-2086d0ab27f7-kube-api-access-kvs8x\") pod 
\"package-server-manager-789f6589d5-p4hjj\" (UID: \"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.310359 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spkrj\" (UniqueName: \"kubernetes.io/projected/d18a3baf-1b3a-4822-aa24-47bb7cda4725-kube-api-access-spkrj\") pod \"dns-default-tl2qn\" (UID: \"d18a3baf-1b3a-4822-aa24-47bb7cda4725\") " pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.322140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.324310 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.824285682 +0000 UTC m=+142.741285495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.324748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.325119 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.825107605 +0000 UTC m=+142.742107418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.331085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b81ca45e-66f6-49c7-9963-b75b6d87c91f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bbljb\" (UID: \"b81ca45e-66f6-49c7-9963-b75b6d87c91f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.340199 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.346177 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr"] Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.347204 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.351644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzgh\" (UniqueName: \"kubernetes.io/projected/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-kube-api-access-2tzgh\") pod \"collect-profiles-29340780-cjt2z\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.355688 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.366219 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.373472 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.380791 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.387252 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sfxp\" (UniqueName: \"kubernetes.io/projected/2e342172-b3e8-4e3b-b4a5-0d050095e20a-kube-api-access-6sfxp\") pod \"machine-config-controller-84d6567774-xplrd\" (UID: \"2e342172-b3e8-4e3b-b4a5-0d050095e20a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.388525 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.392858 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv8dt\" (UniqueName: \"kubernetes.io/projected/1577b547-7e30-4b8e-9959-fdd88088041c-kube-api-access-wv8dt\") pod \"control-plane-machine-set-operator-78cbb6b69f-wzp95\" (UID: \"1577b547-7e30-4b8e-9959-fdd88088041c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.402115 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.408822 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.410169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kghm2\" (UniqueName: \"kubernetes.io/projected/708198b4-2426-4e20-8731-0cdbf6083496-kube-api-access-kghm2\") pod \"packageserver-d55dfcdfc-h56db\" (UID: \"708198b4-2426-4e20-8731-0cdbf6083496\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.419424 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6vd5d"] Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.425333 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:24 crc 
kubenswrapper[4837]: E1014 13:03:24.425626 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:24.925611257 +0000 UTC m=+142.842611070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.425660 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.434065 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.438282 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.442770 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv4v2\" (UniqueName: \"kubernetes.io/projected/048a43ae-98e0-489b-9ef4-c63f44881fa0-kube-api-access-nv4v2\") pod \"migrator-59844c95c7-g98dl\" (UID: \"048a43ae-98e0-489b-9ef4-c63f44881fa0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.471398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkk8\" (UniqueName: \"kubernetes.io/projected/9a06e01d-de06-4ec8-90b7-3dd1e1d3515e-kube-api-access-5bkk8\") pod \"service-ca-operator-777779d784-k7zhr\" (UID: \"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.478293 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.493941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l484d\" (UniqueName: \"kubernetes.io/projected/bfb34230-e125-4ed5-86dc-f6bc57bb7f51-kube-api-access-l484d\") pod \"multus-admission-controller-857f4d67dd-6p2hx\" (UID: \"bfb34230-e125-4ed5-86dc-f6bc57bb7f51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.510266 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgk5\" (UniqueName: \"kubernetes.io/projected/7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845-kube-api-access-fdgk5\") pod \"machine-config-operator-74547568cd-8s7zf\" (UID: \"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.512248 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-59w7l" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.528107 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.529839 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.029826879 +0000 UTC m=+142.946826692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.603117 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kjs2b" event={"ID":"1ad70ef2-45cd-4139-a60e-0bda62597cb9","Type":"ContainerStarted","Data":"ae4b592b44f9ca5a7c9dc13da3b781bdedf74da66c0ba57871224a2377f2e755"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.603195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kjs2b" event={"ID":"1ad70ef2-45cd-4139-a60e-0bda62597cb9","Type":"ContainerStarted","Data":"4d8cb2f330f86e5190db9758462ec8a01db2eb80d792de2a3f6dd37abcafabe3"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.604977 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.605841 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" event={"ID":"12d04376-4d45-4906-9772-84f7c9d313bf","Type":"ContainerStarted","Data":"6e6179c0497d15a2b056a34dd088efb3dd0a4d8605d3e1a60cac2cdfdc0a18ed"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.605872 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" event={"ID":"12d04376-4d45-4906-9772-84f7c9d313bf","Type":"ContainerStarted","Data":"4443fb178000b8a78ef5eb29ffedfb81d6ba49aecad7fda8d046af7496738389"} Oct 14 13:03:24 crc kubenswrapper[4837]: 
I1014 13:03:24.606855 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.613701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" event={"ID":"41984359-fc99-4678-962a-b8c09f7c8e26","Type":"ContainerStarted","Data":"bc93a5b157426d4b53a2710afda4e323c1d6732b545606446bd9851dc3a3269d"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.614021 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" event={"ID":"41984359-fc99-4678-962a-b8c09f7c8e26","Type":"ContainerStarted","Data":"11a6870af1903a45f3a609f7c6aed747e2825d1d1163600cd5775c54bd4b72ba"} Oct 14 13:03:24 crc kubenswrapper[4837]: W1014 13:03:24.619358 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341ec536_bf38_4225_a0bc_da7f4837cdbc.slice/crio-bdd8169bb77120af39faf1f5876fd09bc59522b36e27d233939083bad4e9d4f8 WatchSource:0}: Error finding container bdd8169bb77120af39faf1f5876fd09bc59522b36e27d233939083bad4e9d4f8: Status 404 returned error can't find the container with id bdd8169bb77120af39faf1f5876fd09bc59522b36e27d233939083bad4e9d4f8 Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.620790 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-kjs2b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.620842 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kjs2b" podUID="1ad70ef2-45cd-4139-a60e-0bda62597cb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": 
dial tcp 10.217.0.14:8080: connect: connection refused" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.623121 4837 patch_prober.go:28] interesting pod/console-operator-58897d9998-nmf8f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.623168 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" podUID="12d04376-4d45-4906-9772-84f7c9d313bf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.625704 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" event={"ID":"99a2942a-8cfe-42b7-a339-4d7b30ee12be","Type":"ContainerStarted","Data":"5630af7430e09d43ce7fbb8e776f17726e160b46dd8919d374d5f6cb89f5c6a5"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.625756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" event={"ID":"99a2942a-8cfe-42b7-a339-4d7b30ee12be","Type":"ContainerStarted","Data":"d06a9d205d4af951510167d89bfb9b254c1d31e40b21e451e6525616aaff6cd6"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.629421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.629740 4837 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.129726296 +0000 UTC m=+143.046726109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.637096 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" event={"ID":"e6c42468-5fc7-4a67-86d7-73c0f7589899","Type":"ContainerStarted","Data":"2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.637140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" event={"ID":"e6c42468-5fc7-4a67-86d7-73c0f7589899","Type":"ContainerStarted","Data":"a204ba1b6f60676c587ce4b794a5e1d5f686c3ce7749a8c4c28ef15dc41aa9f0"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.637826 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.640976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" event={"ID":"b67cdcc8-e9cd-4377-89b4-bef6191828b8","Type":"ContainerStarted","Data":"b479d221af28d77daa499de5dbb321df980b7836862ef939e30744db209a0ed5"} Oct 14 13:03:24 crc 
kubenswrapper[4837]: I1014 13:03:24.641000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" event={"ID":"b67cdcc8-e9cd-4377-89b4-bef6191828b8","Type":"ContainerStarted","Data":"ea708925472ff5df4b27e2666c604b95b3a508aa281befc559e6c8eebf075767"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.643394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" event={"ID":"2f32f8e3-1b2f-4bf9-93e6-c7c649a97dca","Type":"ContainerStarted","Data":"fe99b658e9f8816aeb8592a46823617abd3605387dc4412934248f299b34c18e"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.645089 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dfwxh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.645136 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.646018 4837 generic.go:334] "Generic (PLEG): container finished" podID="9653fbf6-7b49-40eb-b8af-1c89f9ed3e88" containerID="5e1e91b971d1efbac19543d92a7f9cae9187357afc09d2e8d25d5fde3d2e5a4b" exitCode=0 Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.646092 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" 
event={"ID":"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88","Type":"ContainerDied","Data":"5e1e91b971d1efbac19543d92a7f9cae9187357afc09d2e8d25d5fde3d2e5a4b"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.646114 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" event={"ID":"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88","Type":"ContainerStarted","Data":"8f9ab450af9ee2cb65f337493c1391d5722d133e6becf5a3085879079571439b"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.658299 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6vd5d" event={"ID":"fb47e83f-903a-4420-9741-645bbbdf63c4","Type":"ContainerStarted","Data":"5dbffabcf973cecbf3a2c02f5754462231b0db6182275f4a5f2123ade48d92eb"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.667965 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.676419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" event={"ID":"50bef027-1010-4814-b1de-a758f875c57d","Type":"ContainerStarted","Data":"d732ead36ea47732a3125e6d9b15bfcf09f4dc23f4b027c9b471e78eab1af8a9"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.676460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" event={"ID":"50bef027-1010-4814-b1de-a758f875c57d","Type":"ContainerStarted","Data":"002005cef3a97526b837a29d56244d7dfa9872ee382a6d8717f66afe3b29c6b0"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.676471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" 
event={"ID":"50bef027-1010-4814-b1de-a758f875c57d","Type":"ContainerStarted","Data":"a2747739231965134cc0713403d230fc2f37dad5cd223ad14714da426cb6c8a0"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.696321 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.706769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" event={"ID":"60cdb7dd-63cf-4f28-ab2f-b58de493e006","Type":"ContainerStarted","Data":"fe903f3091d24ad54b832fd78e2dfc27d58f31f6cc2144d11a4227ff8f2e3ea5"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.707071 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" event={"ID":"60cdb7dd-63cf-4f28-ab2f-b58de493e006","Type":"ContainerStarted","Data":"fdf2bc3d0ff50d9c360fb70a7568b358bc452715b23b5a2b4848feefc5379a10"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.708841 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" event={"ID":"675483c3-eb80-41b4-b02b-db9059ec788b","Type":"ContainerStarted","Data":"7ee17b46ffc48aa5301cc7767de4827e3b61fa83a59a94b24c799775c4aff3ef"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.710018 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" event={"ID":"675483c3-eb80-41b4-b02b-db9059ec788b","Type":"ContainerStarted","Data":"0de2e97976280e2de685e954774e70dedfa57360c4475080a2601a1c77236f83"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.710949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" 
event={"ID":"8c4101d1-244d-4f5c-b059-54b9f26c225f","Type":"ContainerStarted","Data":"ae0044b44524386f0c56288101d91af1d74dd471fb3ac5f9f1a0be52400db674"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.710991 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" event={"ID":"8c4101d1-244d-4f5c-b059-54b9f26c225f","Type":"ContainerStarted","Data":"7e115981c77753e60a14c68d6a844ac6f56453e9bbdd60b7052ed3e110e254a0"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.715803 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.716669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" event={"ID":"d270838b-a09d-4fe8-be26-3310e7989953","Type":"ContainerStarted","Data":"f52e6382c9fa2b52a22616e742475392b600ba4027c02d95b490b31ec0672b8d"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.716713 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" event={"ID":"d270838b-a09d-4fe8-be26-3310e7989953","Type":"ContainerStarted","Data":"7ee0413f8e3e4ab360d3d1c1e2e2fce36b23e00fc6e569bf2c8693b09759639e"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.721722 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" event={"ID":"a284d0f7-a004-45c1-9eb6-a500afacf05b","Type":"ContainerStarted","Data":"65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.721794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" 
event={"ID":"a284d0f7-a004-45c1-9eb6-a500afacf05b","Type":"ContainerStarted","Data":"4312bca648e55e0748617a41fa432695e4a04605a623fc63ad0818301085b871"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.722682 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.723636 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" event={"ID":"74ed9556-5676-44b1-aa3c-02eb697ab0a8","Type":"ContainerStarted","Data":"fd1a041d59be9d90ab8652ce9ef723bbab49bacd4850232d62aed9c651fcfbc1"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.724516 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" event={"ID":"64edb413-91a3-48ab-8d24-131c2d4fecb7","Type":"ContainerStarted","Data":"3c4a325735ed6f450667b69e99b26cdaf05c057ef1dedb91580f7f48ce7f4615"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.724535 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" event={"ID":"64edb413-91a3-48ab-8d24-131c2d4fecb7","Type":"ContainerStarted","Data":"df80ac6ea36ab7dce44d6baa44385366979676040858bf8aec25ad4cf03474e3"} Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.725074 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.729660 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vnpkw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.729720 
4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" podUID="a284d0f7-a004-45c1-9eb6-a500afacf05b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.732375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.734691 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" event={"ID":"3d789157-4dd2-4b8e-befc-84e8c03e6da6","Type":"ContainerStarted","Data":"643d7a34822e661e35667fb2fc5b02ab3bf57a6a8f915ed3ec196694f00faf87"}
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.734726 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" event={"ID":"3d789157-4dd2-4b8e-befc-84e8c03e6da6","Type":"ContainerStarted","Data":"ac0c86a0f316dbe5aca86b8ba87c0881cfe006a0c53e394fbeeff54aeac08d9d"}
Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.735895 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.235880959 +0000 UTC m=+143.152880772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.740331 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8v59z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.740392 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" podUID="64edb413-91a3-48ab-8d24-131c2d4fecb7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.753771 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr"
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.822704 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6zbx7"]
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.822731 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv"]
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.822741 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdj98"]
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.834527 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.834662 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.334644875 +0000 UTC m=+143.251644688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.835101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.841985 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.341972632 +0000 UTC m=+143.258972445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.919738 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xtgwh"]
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.922707 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm47m"]
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.936392 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:24 crc kubenswrapper[4837]: E1014 13:03:24.936752 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.43673885 +0000 UTC m=+143.353738663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.970974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb"]
Oct 14 13:03:24 crc kubenswrapper[4837]: I1014 13:03:24.972981 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.042016 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.042492 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.542481354 +0000 UTC m=+143.459481167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.058484 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.058543 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.086000 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bj55n"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.102015 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95"]
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.143464 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.643446378 +0000 UTC m=+143.560446191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.143593 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.143945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.144409 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.644400004 +0000 UTC m=+143.561399817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.157929 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6r2j7" podStartSLOduration=122.157913768 podStartE2EDuration="2m2.157913768s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.156334365 +0000 UTC m=+143.073334178" watchObservedRunningTime="2025-10-14 13:03:25.157913768 +0000 UTC m=+143.074913581"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.176059 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.245523 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.246579 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" podStartSLOduration=122.246555491 podStartE2EDuration="2m2.246555491s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.20020697 +0000 UTC m=+143.117206783" watchObservedRunningTime="2025-10-14 13:03:25.246555491 +0000 UTC m=+143.163555304"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.246827 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.746805697 +0000 UTC m=+143.663805510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.247652 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-twbvc" podStartSLOduration=122.247644 podStartE2EDuration="2m2.247644s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.244046983 +0000 UTC m=+143.161046796" watchObservedRunningTime="2025-10-14 13:03:25.247644 +0000 UTC m=+143.164643813"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.278862 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x9tvw" podStartSLOduration=122.278844292 podStartE2EDuration="2m2.278844292s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.278383569 +0000 UTC m=+143.195383402" watchObservedRunningTime="2025-10-14 13:03:25.278844292 +0000 UTC m=+143.195844105"
Oct 14 13:03:25 crc kubenswrapper[4837]: W1014 13:03:25.304868 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1b67fb_138a_4071_a0ce_c646e0ed7e7e.slice/crio-a13b9ad3e7c1b6e9c4162202c310bb1ed81d97eeb1c4f852d2a1ada3049eab69 WatchSource:0}: Error finding container a13b9ad3e7c1b6e9c4162202c310bb1ed81d97eeb1c4f852d2a1ada3049eab69: Status 404 returned error can't find the container with id a13b9ad3e7c1b6e9c4162202c310bb1ed81d97eeb1c4f852d2a1ada3049eab69
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.310565 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.348014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.348312 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.848300646 +0000 UTC m=+143.765300459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.359546 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dcmj4" podStartSLOduration=122.359532689 podStartE2EDuration="2m2.359532689s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.359298603 +0000 UTC m=+143.276298426" watchObservedRunningTime="2025-10-14 13:03:25.359532689 +0000 UTC m=+143.276532502"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.360723 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.387758 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl"]
Oct 14 13:03:25 crc kubenswrapper[4837]: W1014 13:03:25.439181 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e168cef_fe99_471f_89db_34290cbb6639.slice/crio-cce46d4401e5535d1caf93c71d62c0004258f57157c8d90a473890d12e65f4fa WatchSource:0}: Error finding container cce46d4401e5535d1caf93c71d62c0004258f57157c8d90a473890d12e65f4fa: Status 404 returned error can't find the container with id cce46d4401e5535d1caf93c71d62c0004258f57157c8d90a473890d12e65f4fa
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.449052 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.449293 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.949269111 +0000 UTC m=+143.866268924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.449702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.450086 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:25.950071692 +0000 UTC m=+143.867071585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.463908 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.486075 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tl2qn"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.550379 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.550792 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.05077495 +0000 UTC m=+143.967774773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.564130 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7fw57" podStartSLOduration=121.56410949 podStartE2EDuration="2m1.56410949s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.526728741 +0000 UTC m=+143.443728554" watchObservedRunningTime="2025-10-14 13:03:25.56410949 +0000 UTC m=+143.481109303"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.653095 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.653654 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.153636306 +0000 UTC m=+144.070636119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.673601 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf"]
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.685132 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-652k8" podStartSLOduration=122.685113025 podStartE2EDuration="2m2.685113025s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.683140052 +0000 UTC m=+143.600139865" watchObservedRunningTime="2025-10-14 13:03:25.685113025 +0000 UTC m=+143.602112838"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.756719 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.756850 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.25682804 +0000 UTC m=+144.173827853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.757208 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.757474 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.257466267 +0000 UTC m=+144.174466080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.787526 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdj98" event={"ID":"926381ee-78f6-4811-9695-0ebf216b3d8b","Type":"ContainerStarted","Data":"4173514c3c6655cf0e03494ca6b0d81b0449e4595cd7b0041bc1f0afc20a07c5"}
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.860313 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.860646 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.360631252 +0000 UTC m=+144.277631065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.926713 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-94j6s" event={"ID":"341ec536-bf38-4225-a0bc-da7f4837cdbc","Type":"ContainerStarted","Data":"c56ee7123542ee26d5c625f859bf84a5d369521ba8b09298bea328acaa543d43"}
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.926902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-94j6s" event={"ID":"341ec536-bf38-4225-a0bc-da7f4837cdbc","Type":"ContainerStarted","Data":"bdd8169bb77120af39faf1f5876fd09bc59522b36e27d233939083bad4e9d4f8"}
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.964566 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" podStartSLOduration=121.964543335 podStartE2EDuration="2m1.964543335s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.907400264 +0000 UTC m=+143.824400077" watchObservedRunningTime="2025-10-14 13:03:25.964543335 +0000 UTC m=+143.881543148"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.965778 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:25 crc kubenswrapper[4837]: E1014 13:03:25.966282 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.466269162 +0000 UTC m=+144.383268975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.974311 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" event={"ID":"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7","Type":"ContainerStarted","Data":"c162084b48ad0ff19dd8e392d8492e8b03034121ccd22725c0a5e6350217beff"}
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.985556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" event={"ID":"3b22ce4d-5f14-40e9-943f-8368b104a4b9","Type":"ContainerStarted","Data":"f0e725004e60fe3424a6d49d45481ae037c9bf247c269b7347fc1e3157e143f0"}
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.991611 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" podStartSLOduration=122.991592126 podStartE2EDuration="2m2.991592126s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:25.97101987 +0000 UTC m=+143.888019703" watchObservedRunningTime="2025-10-14 13:03:25.991592126 +0000 UTC m=+143.908591939"
Oct 14 13:03:25 crc kubenswrapper[4837]: I1014 13:03:25.993782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" event={"ID":"aeb6baa7-a962-4526-bb66-5907ac7c0141","Type":"ContainerStarted","Data":"d1a756d39169d4ead86b9eead85a56c4f8b29be205e3c4cf652deba0ed4d7bd9"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.007977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-59w7l" event={"ID":"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e","Type":"ContainerStarted","Data":"a13b9ad3e7c1b6e9c4162202c310bb1ed81d97eeb1c4f852d2a1ada3049eab69"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.032675 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6p2hx"]
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.032798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" event={"ID":"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6","Type":"ContainerStarted","Data":"bbda826c0ca7a87371d8c024698f690009c6281471f79e7cf62d8fd86aa6d1b4"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.033744 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c5zxj" podStartSLOduration=123.033723933 podStartE2EDuration="2m3.033723933s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.000300281 +0000 UTC m=+143.917300104" watchObservedRunningTime="2025-10-14 13:03:26.033723933 +0000 UTC m=+143.950723746"
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.034282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" event={"ID":"b7f932ac-bd67-48e8-9f8d-b90218acaeda","Type":"ContainerStarted","Data":"2d048a8d7524178cde803841f4c29045a0cdf37fdf63a5f2c30c27517cd6ba21"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.045745 4837 generic.go:334] "Generic (PLEG): container finished" podID="74ed9556-5676-44b1-aa3c-02eb697ab0a8" containerID="e99735b194ff95de5161c768b7d228473135605c50665bd6c3bbbe12eb775b8b" exitCode=0
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.045806 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" event={"ID":"74ed9556-5676-44b1-aa3c-02eb697ab0a8","Type":"ContainerDied","Data":"e99735b194ff95de5161c768b7d228473135605c50665bd6c3bbbe12eb775b8b"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.054332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" event={"ID":"2e342172-b3e8-4e3b-b4a5-0d050095e20a","Type":"ContainerStarted","Data":"548c5359774dbf84580e1514bac38067589a09746d8f05508acb4cd813436edd"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.062398 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-kjs2b" podStartSLOduration=123.062384416 podStartE2EDuration="2m3.062384416s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.061693547 +0000 UTC m=+143.978693360" watchObservedRunningTime="2025-10-14 13:03:26.062384416 +0000 UTC m=+143.979384229"
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.067793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" event={"ID":"708198b4-2426-4e20-8731-0cdbf6083496","Type":"ContainerStarted","Data":"171c73dbfe3b72995124c58557879f650186ebd934db2c829f34e5166aba6f63"}
Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.068455 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.069617 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.569600241 +0000 UTC m=+144.486600054 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.077924 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" event={"ID":"17c00ce0-01af-4442-ad19-cc2ab6ff94dc","Type":"ContainerStarted","Data":"b4bc77bd481c8b326612b7af574d5511c9a485818cfc8edf29b722a18c16d8d9"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.084414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" event={"ID":"05a8461c-66c0-46a5-82cd-c0f075fd5842","Type":"ContainerStarted","Data":"83a4f74c4d3a85394b2e232c682d6a04e08312d1d3970691f5a64172865f1802"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.088106 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" event={"ID":"0e168cef-fe99-471f-89db-34290cbb6639","Type":"ContainerStarted","Data":"cce46d4401e5535d1caf93c71d62c0004258f57157c8d90a473890d12e65f4fa"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.106789 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr"] Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.119918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" 
event={"ID":"1577b547-7e30-4b8e-9959-fdd88088041c","Type":"ContainerStarted","Data":"3922521ecae87aaa4209ee90f2fc8c999bf1f14950467cdeb968b79e89472951"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.125879 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" podStartSLOduration=123.125855389 podStartE2EDuration="2m3.125855389s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.122127078 +0000 UTC m=+144.039126891" watchObservedRunningTime="2025-10-14 13:03:26.125855389 +0000 UTC m=+144.042855212" Oct 14 13:03:26 crc kubenswrapper[4837]: W1014 13:03:26.131805 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb34230_e125_4ed5_86dc_f6bc57bb7f51.slice/crio-4874b170497b18cd9a6c2f826b6655e10b70180d8c96996a55e9f40337ce35a7 WatchSource:0}: Error finding container 4874b170497b18cd9a6c2f826b6655e10b70180d8c96996a55e9f40337ce35a7: Status 404 returned error can't find the container with id 4874b170497b18cd9a6c2f826b6655e10b70180d8c96996a55e9f40337ce35a7 Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.145975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6vd5d" event={"ID":"fb47e83f-903a-4420-9741-645bbbdf63c4","Type":"ContainerStarted","Data":"189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25"} Oct 14 13:03:26 crc kubenswrapper[4837]: W1014 13:03:26.168964 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a06e01d_de06_4ec8_90b7_3dd1e1d3515e.slice/crio-f5d4afa0fc94deb9dc6dfddd1ef95a31bb8c4b5ff648f48941a359cedff425b7 WatchSource:0}: Error finding container 
f5d4afa0fc94deb9dc6dfddd1ef95a31bb8c4b5ff648f48941a359cedff425b7: Status 404 returned error can't find the container with id f5d4afa0fc94deb9dc6dfddd1ef95a31bb8c4b5ff648f48941a359cedff425b7 Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.169640 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.169923 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.669911417 +0000 UTC m=+144.586911230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.179695 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tl2qn" event={"ID":"d18a3baf-1b3a-4822-aa24-47bb7cda4725","Type":"ContainerStarted","Data":"7dc7012ce3667707c550e539adb9a73623a647aec7b96c6b40154c777db0f046"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.190553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" event={"ID":"b81ca45e-66f6-49c7-9963-b75b6d87c91f","Type":"ContainerStarted","Data":"0962d7389faeec1be1b1e7ea96c2df53fe7c88268aa8c23c67e614d24531c085"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.198912 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" event={"ID":"13b89429-bd07-413e-9436-fe6f28d882ff","Type":"ContainerStarted","Data":"3e988775f5f42923603f63334df1e09d5364dd1eed9747609ad472afc840dcb4"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.199566 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.199582 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl"] Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.202735 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" event={"ID":"675483c3-eb80-41b4-b02b-db9059ec788b","Type":"ContainerStarted","Data":"a1a4ec2bfd1b89de77567675ddad851c1171397714a247880c9d4fec021cc823"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.207648 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" event={"ID":"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5","Type":"ContainerStarted","Data":"7258dee44d3606211a2853abcec1932ac258021a166fc4a97ea972e9542ae34e"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.208942 4837 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-tkgxv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.208989 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" podUID="13b89429-bd07-413e-9436-fe6f28d882ff" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.214456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" event={"ID":"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe","Type":"ContainerStarted","Data":"79dcd9046a165aa773a29073f4a7c50c3f8739bd16af713d24243e97e3080071"} Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.256813 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dfwxh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" 
start-of-body= Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.256860 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.260105 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-kjs2b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.260172 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kjs2b" podUID="1ad70ef2-45cd-4139-a60e-0bda62597cb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.274723 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.276587 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.776562125 +0000 UTC m=+144.693561998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.282242 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.330022 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" podStartSLOduration=123.330005767 podStartE2EDuration="2m3.330005767s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.329292658 +0000 UTC m=+144.246292481" watchObservedRunningTime="2025-10-14 13:03:26.330005767 +0000 UTC m=+144.247005580" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.358988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.359551 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.359613 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" 
containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.388989 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.390473 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.890457449 +0000 UTC m=+144.807457262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.466588 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nmf8f" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.497602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.498696 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:26.998670799 +0000 UTC m=+144.915670612 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.510430 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6vd5d" podStartSLOduration=123.510410896 podStartE2EDuration="2m3.510410896s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.482974915 +0000 UTC m=+144.399974728" watchObservedRunningTime="2025-10-14 13:03:26.510410896 +0000 UTC m=+144.427410709" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.563581 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" podStartSLOduration=122.56356313 podStartE2EDuration="2m2.56356313s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.523242622 +0000 UTC m=+144.440242435" 
watchObservedRunningTime="2025-10-14 13:03:26.56356313 +0000 UTC m=+144.480562943" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.599511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.599863 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.09985208 +0000 UTC m=+145.016851893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.604945 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-94j6s" podStartSLOduration=122.604925987 podStartE2EDuration="2m2.604925987s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.563524839 +0000 UTC m=+144.480524652" watchObservedRunningTime="2025-10-14 13:03:26.604925987 +0000 UTC m=+144.521925800" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.658725 
4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" podStartSLOduration=122.658697718 podStartE2EDuration="2m2.658697718s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.607922748 +0000 UTC m=+144.524922571" watchObservedRunningTime="2025-10-14 13:03:26.658697718 +0000 UTC m=+144.575697531" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.692953 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dfjx8" podStartSLOduration=122.692936891 podStartE2EDuration="2m2.692936891s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.690011082 +0000 UTC m=+144.607010905" watchObservedRunningTime="2025-10-14 13:03:26.692936891 +0000 UTC m=+144.609936704" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.700723 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.701015 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.201002579 +0000 UTC m=+145.118002392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.732550 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" podStartSLOduration=123.73253366 podStartE2EDuration="2m3.73253366s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.729118898 +0000 UTC m=+144.646118721" watchObservedRunningTime="2025-10-14 13:03:26.73253366 +0000 UTC m=+144.649533473" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.769496 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" podStartSLOduration=122.769475646 podStartE2EDuration="2m2.769475646s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:26.764301397 +0000 UTC m=+144.681301210" watchObservedRunningTime="2025-10-14 13:03:26.769475646 +0000 UTC m=+144.686475459" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.803884 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: 
\"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.804168 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.304142863 +0000 UTC m=+145.221142676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.905398 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:26 crc kubenswrapper[4837]: E1014 13:03:26.905781 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.405767535 +0000 UTC m=+145.322767348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.940240 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:26 crc kubenswrapper[4837]: I1014 13:03:26.940289 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.007015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.007393 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.507377986 +0000 UTC m=+145.424377799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.028831 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.107749 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.108181 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.608151606 +0000 UTC m=+145.525151419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.209101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.209822 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.70981067 +0000 UTC m=+145.626810483 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.310717 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.310934 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.810908568 +0000 UTC m=+145.727908381 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.310995 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.311370 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.81136351 +0000 UTC m=+145.728363323 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.313907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tl2qn" event={"ID":"d18a3baf-1b3a-4822-aa24-47bb7cda4725","Type":"ContainerStarted","Data":"a4cc87b8ce967a60bec377f0278619a89bebb4671b8da354ca027a91d647f07a"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.315595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" event={"ID":"7ba8ba21-92c6-43a9-9d36-7d1d703f6aa6","Type":"ContainerStarted","Data":"93927a2286d646a79960c1ffc6890d3aba6f5d5f9edd51526d1c49868636dcd5"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.319754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" event={"ID":"17c00ce0-01af-4442-ad19-cc2ab6ff94dc","Type":"ContainerStarted","Data":"f0f6baf59807bb86c23b6c0a9cd97adc138c87db63014dcd5b2573f8415dc4cb"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.326621 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" event={"ID":"708198b4-2426-4e20-8731-0cdbf6083496","Type":"ContainerStarted","Data":"e94dd848345fbf49edcd13d380b9e13388503ab30e3db2edbd1c0b6bb4b325c9"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.326875 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.328771 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" event={"ID":"aeb6baa7-a962-4526-bb66-5907ac7c0141","Type":"ContainerStarted","Data":"2b5b1ed0b60dc92c52d812d2d8a3bce9ab6df59ac72dba3cc2658a335ed12024"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.329481 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.331284 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h56db container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.331331 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" podUID="708198b4-2426-4e20-8731-0cdbf6083496" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.340941 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rtlpl" podStartSLOduration=123.340914347 podStartE2EDuration="2m3.340914347s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.340257289 +0000 UTC m=+145.257257102" watchObservedRunningTime="2025-10-14 13:03:27.340914347 +0000 UTC m=+145.257914160"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.351640 4837 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-td9k8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.351692 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" podUID="aeb6baa7-a962-4526-bb66-5907ac7c0141" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.362852 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 14 13:03:27 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld
Oct 14 13:03:27 crc kubenswrapper[4837]: [+]process-running ok
Oct 14 13:03:27 crc kubenswrapper[4837]: healthz check failed
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.362916 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.386448 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" podStartSLOduration=123.386434696 podStartE2EDuration="2m3.386434696s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.383772374 +0000 UTC m=+145.300772187" watchObservedRunningTime="2025-10-14 13:03:27.386434696 +0000 UTC m=+145.303434509"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.400863 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" event={"ID":"2e342172-b3e8-4e3b-b4a5-0d050095e20a","Type":"ContainerStarted","Data":"64b7812401400d74bd938e69f3a28e4eb9c90f93957002332e4cea996411ef3a"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.400910 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" event={"ID":"2e342172-b3e8-4e3b-b4a5-0d050095e20a","Type":"ContainerStarted","Data":"85ccbc0cf94349300c14a41bcf50f931181225c25a1bfa61a857eb93f2e8fda9"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.412241 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.414225 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:27.914203664 +0000 UTC m=+145.831203477 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.414224 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-59w7l" event={"ID":"4d1b67fb-138a-4071-a0ce-c646e0ed7e7e","Type":"ContainerStarted","Data":"d9256fa5cf26ac310d15a98ec263a5b2e6b6b3642db954ac3f6b449fd3a56729"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.451706 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" event={"ID":"b7f932ac-bd67-48e8-9f8d-b90218acaeda","Type":"ContainerStarted","Data":"09af55c061db1289111d2d86e9ef9d12f780fcfb033e44866a2d215fb393cfb9"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.463829 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" event={"ID":"bfb34230-e125-4ed5-86dc-f6bc57bb7f51","Type":"ContainerStarted","Data":"d75210cbae8cd075166f04fc33ffbff89f84f3cb6fc3ebd00c724c110e0877c2"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.463881 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" event={"ID":"bfb34230-e125-4ed5-86dc-f6bc57bb7f51","Type":"ContainerStarted","Data":"4874b170497b18cd9a6c2f826b6655e10b70180d8c96996a55e9f40337ce35a7"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.486533 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" podStartSLOduration=123.486516826 podStartE2EDuration="2m3.486516826s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.475642172 +0000 UTC m=+145.392641985" watchObservedRunningTime="2025-10-14 13:03:27.486516826 +0000 UTC m=+145.403516639"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.501466 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" event={"ID":"048a43ae-98e0-489b-9ef4-c63f44881fa0","Type":"ContainerStarted","Data":"4f8da56f0dbecbb7eeb8219d5177f806aaf61091321b87b6815898d72af427a0"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.501520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" event={"ID":"048a43ae-98e0-489b-9ef4-c63f44881fa0","Type":"ContainerStarted","Data":"4f0a8e8663846e8c823a37d3b20631fec6a2adaa6f5e60d49921f98dbd535e79"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.513587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" event={"ID":"0e168cef-fe99-471f-89db-34290cbb6639","Type":"ContainerStarted","Data":"5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.513930 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.514066 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.514678 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.014664365 +0000 UTC m=+145.931664178 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.533571 4837 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sm47m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.533612 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" podUID="0e168cef-fe99-471f-89db-34290cbb6639" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.533678 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24l9g" podStartSLOduration=123.533668348 podStartE2EDuration="2m3.533668348s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.532431556 +0000 UTC m=+145.449431369" watchObservedRunningTime="2025-10-14 13:03:27.533668348 +0000 UTC m=+145.450668161"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.558386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" event={"ID":"74ed9556-5676-44b1-aa3c-02eb697ab0a8","Type":"ContainerStarted","Data":"ea343276d3948808212f45615cd32e51b15118df237d241422187df5c19e75ed"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.560127 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mfkfl" podStartSLOduration=123.560110303 podStartE2EDuration="2m3.560110303s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.557493702 +0000 UTC m=+145.474493525" watchObservedRunningTime="2025-10-14 13:03:27.560110303 +0000 UTC m=+145.477110116"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.590766 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv" event={"ID":"13b89429-bd07-413e-9436-fe6f28d882ff","Type":"ContainerStarted","Data":"e3a9053d98dbe3cd27be6090a011caf1cf400bc1866582b1b453ef67b516b012"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.609370 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" podStartSLOduration=123.609343501 podStartE2EDuration="2m3.609343501s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.589266859 +0000 UTC m=+145.506266672" watchObservedRunningTime="2025-10-14 13:03:27.609343501 +0000 UTC m=+145.526343314"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.620042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.620238 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.120209114 +0000 UTC m=+146.037208927 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.620307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.621449 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.121436867 +0000 UTC m=+146.038436680 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.629929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6zbx7" event={"ID":"ef8e676b-eb8b-4b8a-943e-0a9c6802ecfe","Type":"ContainerStarted","Data":"0fd1852173442d8815983c58297d0802afdcce549128aa21a0df7ca43003c6af"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.661470 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-59w7l" podStartSLOduration=6.661454277 podStartE2EDuration="6.661454277s" podCreationTimestamp="2025-10-14 13:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.61114192 +0000 UTC m=+145.528141733" watchObservedRunningTime="2025-10-14 13:03:27.661454277 +0000 UTC m=+145.578454090"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.662642 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" event={"ID":"1577b547-7e30-4b8e-9959-fdd88088041c","Type":"ContainerStarted","Data":"3ea78e86286f78728c3aa6379ed1794de3473e94ef13c8ca38780ddc5e7e8b52"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.664377 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tkgxv"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.666084 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" event={"ID":"9653fbf6-7b49-40eb-b8af-1c89f9ed3e88","Type":"ContainerStarted","Data":"1fdd2e968d3931f4d8d42f139ea3e686655ee741356d52759d47bd105556e5af"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.666747 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.667849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdj98" event={"ID":"926381ee-78f6-4811-9695-0ebf216b3d8b","Type":"ContainerStarted","Data":"1c6f9bbb3111f53e8bd50791810a3a62d03d441da388e7f2bf9a7730533f5a9c"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.669169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" event={"ID":"05a8461c-66c0-46a5-82cd-c0f075fd5842","Type":"ContainerStarted","Data":"940ecb42560bf4e05e45f5a135daddc3ad36b2873ec630189797671e250c7dad"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.672743 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" event={"ID":"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e","Type":"ContainerStarted","Data":"09a60ab6b4c8b3c899436d8d2df476e7df19a5c47e44d89639e2bb104a86b537"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.672779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" event={"ID":"9a06e01d-de06-4ec8-90b7-3dd1e1d3515e","Type":"ContainerStarted","Data":"f5d4afa0fc94deb9dc6dfddd1ef95a31bb8c4b5ff648f48941a359cedff425b7"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.692485 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" event={"ID":"b81ca45e-66f6-49c7-9963-b75b6d87c91f","Type":"ContainerStarted","Data":"af3b94ab924735d7334590ecdfc8649ce1fe8ac9f42148a9cbb44fcb6795b587"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.701197 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xplrd" podStartSLOduration=123.701168809 podStartE2EDuration="2m3.701168809s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.662220078 +0000 UTC m=+145.579219891" watchObservedRunningTime="2025-10-14 13:03:27.701168809 +0000 UTC m=+145.618168622"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.710138 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" event={"ID":"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5","Type":"ContainerStarted","Data":"c0160032115176d3853cc8ec051c96849cb3424aee5ed74716737b2714a99d57"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.722583 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.722689 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.222668999 +0000 UTC m=+146.139668802 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.723294 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.725464 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.225455764 +0000 UTC m=+146.142455577 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.734375 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" podStartSLOduration=123.734361505 podStartE2EDuration="2m3.734361505s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.707077578 +0000 UTC m=+145.624077391" watchObservedRunningTime="2025-10-14 13:03:27.734361505 +0000 UTC m=+145.651361318"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.735315 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gdj98" podStartSLOduration=6.7353107 podStartE2EDuration="6.7353107s" podCreationTimestamp="2025-10-14 13:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.733013348 +0000 UTC m=+145.650013161" watchObservedRunningTime="2025-10-14 13:03:27.7353107 +0000 UTC m=+145.652310513"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.740328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" event={"ID":"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7","Type":"ContainerStarted","Data":"91fd76ae88d86701b7d4c0deda1d42cbb46eacbe11a745012f93e54d014f2b98"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.740369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" event={"ID":"a8a3a1d7-2659-42f9-92c3-2086d0ab27f7","Type":"ContainerStarted","Data":"571f62c06410c8bdcb4b230ccb0a8bd98d57e2de7724177149321f8faff16f55"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.741152 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.775510 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" event={"ID":"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845","Type":"ContainerStarted","Data":"2ecaf72684c6384eec8bb65713b447c9f0864ce645bc6228a84028e9e6d0323c"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.775796 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" event={"ID":"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845","Type":"ContainerStarted","Data":"676107aef1875d78793a0585354662d4520530d62a3dd01ac11192f01c3f4055"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.778253 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n6d46" event={"ID":"d270838b-a09d-4fe8-be26-3310e7989953","Type":"ContainerStarted","Data":"a965b22d7704fa402d565b79a1af5bf0c932b99425dce04d0797fded429b1564"}
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.809828 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.824289 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.824660 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.324644631 +0000 UTC m=+146.241644434 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.832308 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wzp95" podStartSLOduration=123.832297088 podStartE2EDuration="2m3.832297088s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.793523671 +0000 UTC m=+145.710523494" watchObservedRunningTime="2025-10-14 13:03:27.832297088 +0000 UTC m=+145.749296901"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.863641 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" podStartSLOduration=124.863625163 podStartE2EDuration="2m4.863625163s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.862379108 +0000 UTC m=+145.779378921" watchObservedRunningTime="2025-10-14 13:03:27.863625163 +0000 UTC m=+145.780624976"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.893237 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-k7zhr" podStartSLOduration=123.893220081 podStartE2EDuration="2m3.893220081s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.892787199 +0000 UTC m=+145.809787012" watchObservedRunningTime="2025-10-14 13:03:27.893220081 +0000 UTC m=+145.810219884"
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.926393 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75"
Oct 14 13:03:27 crc kubenswrapper[4837]: E1014 13:03:27.927583 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.427567858 +0000 UTC m=+146.344567671 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 14 13:03:27 crc kubenswrapper[4837]: I1014 13:03:27.949067 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bbljb" podStartSLOduration=123.949051117 podStartE2EDuration="2m3.949051117s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:27.948523103 +0000 UTC m=+145.865522916" watchObservedRunningTime="2025-10-14 13:03:27.949051117 +0000 UTC m=+145.866050940"
Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.013451 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" podStartSLOduration=124.013435225 podStartE2EDuration="2m4.013435225s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:28.011022391 +0000 UTC m=+145.928022204" watchObservedRunningTime="2025-10-14 13:03:28.013435225 +0000 UTC m=+145.930435038"
Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.028627 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.028999 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.528981685 +0000 UTC m=+146.445981498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.130303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.130701 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.630684279 +0000 UTC m=+146.547684092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.230958 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.231128 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.731098389 +0000 UTC m=+146.648098202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.231306 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.231602 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.731590332 +0000 UTC m=+146.648590145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.332990 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.333180 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.833133533 +0000 UTC m=+146.750133346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.333239 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.333566 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.833557444 +0000 UTC m=+146.750557257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.361226 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:28 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:28 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:28 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.361301 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.434091 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.434285 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-14 13:03:28.934257402 +0000 UTC m=+146.851257215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.434662 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.434946 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:28.93493393 +0000 UTC m=+146.851933743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.484783 4837 patch_prober.go:28] interesting pod/apiserver-76f77b778f-k88lw container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]log ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]etcd ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:03:28 crc kubenswrapper[4837]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 14 13:03:28 crc kubenswrapper[4837]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:03:28 crc kubenswrapper[4837]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:03:28 crc 
kubenswrapper[4837]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:03:28 crc kubenswrapper[4837]: livez check failed Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.484837 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" podUID="60cdb7dd-63cf-4f28-ab2f-b58de493e006" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.535801 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.536002 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.035977466 +0000 UTC m=+146.952977279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.536183 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.536533 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.036520361 +0000 UTC m=+146.953520164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.637712 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.637897 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.137871866 +0000 UTC m=+147.054871679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.638010 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.638337 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.138329648 +0000 UTC m=+147.055329461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.739730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.739938 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.239896409 +0000 UTC m=+147.156896232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.740024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.740409 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.240393263 +0000 UTC m=+147.157393076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.789853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" event={"ID":"bfb34230-e125-4ed5-86dc-f6bc57bb7f51","Type":"ContainerStarted","Data":"3154d6b6c8535347332fd4624eaf7e0f569fdf8b0981c866599ae54d101dd037"} Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.790257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" event={"ID":"048a43ae-98e0-489b-9ef4-c63f44881fa0","Type":"ContainerStarted","Data":"3fbd00d87acd0f35a5e5a9724b9efa2253705b8f28b75e9c8a1e997ff019d352"} Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.792620 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" event={"ID":"05a8461c-66c0-46a5-82cd-c0f075fd5842","Type":"ContainerStarted","Data":"f294832f14b9981f58ffc9f636b8a44edbafd0cae56f0ab9064cf51f5a86dcf9"} Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.793916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tl2qn" event={"ID":"d18a3baf-1b3a-4822-aa24-47bb7cda4725","Type":"ContainerStarted","Data":"b1b207de7b25b1f4d312bb2807306ba61ffed8f33da66125f01cc7851614bc83"} Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.794119 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 
13:03:28.795527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" event={"ID":"3b22ce4d-5f14-40e9-943f-8368b104a4b9","Type":"ContainerStarted","Data":"a15d60d48879ca6d2d7b69b3f6ab91ceb6f87dedcfe14abd48a6877455cd8eab"} Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.800957 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8s7zf" event={"ID":"7a1fcfdc-5c1d-4fbc-8f2a-08cc2ac7f845","Type":"ContainerStarted","Data":"8621d5cff8d5cd4a13bf342fe79d3c0d4cebb8f956613b9c4f985043d6e5ba37"} Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.801502 4837 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sm47m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.801543 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" podUID="0e168cef-fe99-471f-89db-34290cbb6639" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.810589 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-td9k8" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.814858 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.814911 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:28 crc 
kubenswrapper[4837]: I1014 13:03:28.840784 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.840909 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.340886934 +0000 UTC m=+147.257886747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.841512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.843300 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-14 13:03:29.343289559 +0000 UTC m=+147.260289372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.863934 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p2hx" podStartSLOduration=124.863914876 podStartE2EDuration="2m4.863914876s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:28.863906585 +0000 UTC m=+146.780906398" watchObservedRunningTime="2025-10-14 13:03:28.863914876 +0000 UTC m=+146.780914689" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.864303 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" podStartSLOduration=124.864297776 podStartE2EDuration="2m4.864297776s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:28.037952777 +0000 UTC m=+145.954952590" watchObservedRunningTime="2025-10-14 13:03:28.864297776 +0000 UTC m=+146.781297589" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.922585 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bj55n" 
podStartSLOduration=124.922562828 podStartE2EDuration="2m4.922562828s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:28.901717325 +0000 UTC m=+146.818717138" watchObservedRunningTime="2025-10-14 13:03:28.922562828 +0000 UTC m=+146.839562641" Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.943709 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.944033 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.444005017 +0000 UTC m=+147.361004830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.944091 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:28 crc kubenswrapper[4837]: E1014 13:03:28.944484 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.4444657 +0000 UTC m=+147.361465513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:28 crc kubenswrapper[4837]: I1014 13:03:28.996638 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-g98dl" podStartSLOduration=124.996620646 podStartE2EDuration="2m4.996620646s" podCreationTimestamp="2025-10-14 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:28.993512783 +0000 UTC m=+146.910512596" watchObservedRunningTime="2025-10-14 13:03:28.996620646 +0000 UTC m=+146.913620459" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.034554 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tl2qn" podStartSLOduration=8.03453867 podStartE2EDuration="8.03453867s" podCreationTimestamp="2025-10-14 13:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:29.034488319 +0000 UTC m=+146.951488132" watchObservedRunningTime="2025-10-14 13:03:29.03453867 +0000 UTC m=+146.951538483" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.046424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.046732 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.546717188 +0000 UTC m=+147.463717001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.149747 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.150107 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.650091918 +0000 UTC m=+147.567091731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.250664 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.251001 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.750986681 +0000 UTC m=+147.667986494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.351670 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.352055 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.852037928 +0000 UTC m=+147.769037741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.368087 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:29 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:29 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:29 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.368172 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.443802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n8266"] Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.445220 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.449241 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.452257 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.452390 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.952374145 +0000 UTC m=+147.869373958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.452574 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-utilities\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.452625 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-catalog-content\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.452677 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.452718 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqtm\" (UniqueName: 
\"kubernetes.io/projected/49e47458-6044-4966-a0e5-3a8e243414f8-kube-api-access-8dqtm\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.453015 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:29.953003942 +0000 UTC m=+147.870003755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.456545 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.519442 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n8266"] Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.553493 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.553789 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.053765561 +0000 UTC m=+147.970765374 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.553948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqtm\" (UniqueName: \"kubernetes.io/projected/49e47458-6044-4966-a0e5-3a8e243414f8-kube-api-access-8dqtm\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.554015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-utilities\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.554041 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-catalog-content\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.554101 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.554407 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.054394858 +0000 UTC m=+147.971394671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.555471 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-utilities\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.555702 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-catalog-content\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc 
kubenswrapper[4837]: I1014 13:03:29.593050 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqtm\" (UniqueName: \"kubernetes.io/projected/49e47458-6044-4966-a0e5-3a8e243414f8-kube-api-access-8dqtm\") pod \"certified-operators-n8266\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.649991 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p5s4"] Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.651496 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.655470 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.655573 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.155555829 +0000 UTC m=+148.072555642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.655696 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.655980 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.15596843 +0000 UTC m=+148.072968243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.658319 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.711224 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p5s4"] Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.712034 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h56db" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.756692 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.756908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-utilities\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.756958 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.256931394 +0000 UTC m=+148.173931207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.757019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbzk7\" (UniqueName: \"kubernetes.io/projected/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-kube-api-access-dbzk7\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.757081 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-catalog-content\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.757182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.757476 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.257463528 +0000 UTC m=+148.174463341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.761280 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.816387 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tvv2p" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.823069 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" event={"ID":"3b22ce4d-5f14-40e9-943f-8368b104a4b9","Type":"ContainerStarted","Data":"5308bbc3ba47162fe2ebf6caac16b3ead7215070972b29bee79d52c6b408edc9"} Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.823127 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" event={"ID":"3b22ce4d-5f14-40e9-943f-8368b104a4b9","Type":"ContainerStarted","Data":"32e0108d0ab40b6a96995d730b227518a2550a308533f4220664fffe57b01593"} Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.823137 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" event={"ID":"3b22ce4d-5f14-40e9-943f-8368b104a4b9","Type":"ContainerStarted","Data":"73e64b539c50081f4780834ce86af3619198a9057e8b5e1e71dffdd7ada30a0c"} Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.832374 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.834928 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fq9rr" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.859877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.860251 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.360229982 +0000 UTC m=+148.277229795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.861252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbzk7\" (UniqueName: \"kubernetes.io/projected/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-kube-api-access-dbzk7\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.861647 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-catalog-content\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.861675 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vh2lz"] Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.862559 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.861683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.863343 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.863375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.863476 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.863529 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.863563 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-utilities\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.865805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-catalog-content\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.866434 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-utilities\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.867110 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.367094037 +0000 UTC m=+148.284093850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.870591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.876413 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.881495 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.881691 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.890946 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vh2lz"] Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.923340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbzk7\" (UniqueName: \"kubernetes.io/projected/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-kube-api-access-dbzk7\") pod \"community-operators-8p5s4\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.967613 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:03:29 crc kubenswrapper[4837]: I1014 13:03:29.970654 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:29 crc kubenswrapper[4837]: E1014 13:03:29.970992 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.47097464 +0000 UTC m=+148.387974453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.035090 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.044490 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2lq7f"] Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.045351 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.045785 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.076636 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-utilities\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077102 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-catalog-content\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-catalog-content\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077236 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077263 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-utilities\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhxw\" (UniqueName: \"kubernetes.io/projected/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-kube-api-access-kwhxw\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.077332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvdpx\" (UniqueName: \"kubernetes.io/projected/eb913442-9b33-4a88-a111-164879c37512-kube-api-access-lvdpx\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.077594 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.577581907 +0000 UTC m=+148.494581720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.107416 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lq7f"] Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.116193 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xtgwh" podStartSLOduration=9.116150238 podStartE2EDuration="9.116150238s" podCreationTimestamp="2025-10-14 13:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:30.091452801 +0000 UTC m=+148.008452614" watchObservedRunningTime="2025-10-14 13:03:30.116150238 +0000 UTC m=+148.033150131" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.178927 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-utilities\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 
13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhxw\" (UniqueName: \"kubernetes.io/projected/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-kube-api-access-kwhxw\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvdpx\" (UniqueName: \"kubernetes.io/projected/eb913442-9b33-4a88-a111-164879c37512-kube-api-access-lvdpx\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179190 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-utilities\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179211 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-catalog-content\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-catalog-content\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 
13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.179951 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-catalog-content\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.182675 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.682655493 +0000 UTC m=+148.599655306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.183011 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-utilities\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.183310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-utilities\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 
crc kubenswrapper[4837]: I1014 13:03:30.183626 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-catalog-content\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.224345 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvdpx\" (UniqueName: \"kubernetes.io/projected/eb913442-9b33-4a88-a111-164879c37512-kube-api-access-lvdpx\") pod \"community-operators-2lq7f\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.245040 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhxw\" (UniqueName: \"kubernetes.io/projected/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-kube-api-access-kwhxw\") pod \"certified-operators-vh2lz\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.280808 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.281147 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.78113076 +0000 UTC m=+148.698130573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.282559 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n8266"] Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.367701 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:30 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:30 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:30 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.367737 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.386535 4837 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.387828 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.388228 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.888212339 +0000 UTC m=+148.805212152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.423475 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.452511 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p5s4"] Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.488876 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.489152 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:30.989140383 +0000 UTC m=+148.906140196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.514711 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.590362 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.590502 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:31.090479627 +0000 UTC m=+149.007479440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.590909 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.591207 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:31.091194627 +0000 UTC m=+149.008194440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.691897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.692293 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:31.192275905 +0000 UTC m=+149.109275708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.750501 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vh2lz"] Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.792652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.793030 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:31.293011463 +0000 UTC m=+149.210011356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.833721 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh2lz" event={"ID":"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4","Type":"ContainerStarted","Data":"35b0428104c0bbfccae6379c066ce477a4ef32d9e5a5953a0c7555453bf0ab15"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.836366 4837 generic.go:334] "Generic (PLEG): container finished" podID="49e47458-6044-4966-a0e5-3a8e243414f8" containerID="9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066" exitCode=0 Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.836419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerDied","Data":"9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.836436 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerStarted","Data":"710e6da397c983c3c704f236eabe147caf23fb20e1fe8d8122319f76571e15f9"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.837750 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.840888 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"868ce84a486b453587bb898f2560264005ee3322920553a0cfd183be673555ca"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.840908 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"503695b50b7260077d62f0e40ff3e5c768d58186d0d1995d0244469b5ccfe34a"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.841847 4837 generic.go:334] "Generic (PLEG): container finished" podID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerID="a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db" exitCode=0 Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.841879 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p5s4" event={"ID":"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9","Type":"ContainerDied","Data":"a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.841892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p5s4" event={"ID":"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9","Type":"ContainerStarted","Data":"16f5d2bac1ad0b2509a9ed9b4bbdcd66181a3998016196bdba8e30b737f83edf"} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.843711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fe23909826c477e74f318df6000cdf6dad7ce7f74e3d7f95b04c8e43c48ff155"} Oct 14 13:03:30 crc kubenswrapper[4837]: W1014 13:03:30.844843 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-9914385c93dddd4cf19d6f8edb3dc2754ecd7a4dbde267301d01d9ccb25278c4 WatchSource:0}: Error finding container 9914385c93dddd4cf19d6f8edb3dc2754ecd7a4dbde267301d01d9ccb25278c4: Status 404 returned error can't find the container with id 9914385c93dddd4cf19d6f8edb3dc2754ecd7a4dbde267301d01d9ccb25278c4 Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.894306 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.894787 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-14 13:03:31.394756299 +0000 UTC m=+149.311756112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.895002 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:30 crc kubenswrapper[4837]: E1014 13:03:30.898645 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-14 13:03:31.398630493 +0000 UTC m=+149.315630406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqs75" (UID: "14165edd-b69a-4886-8405-09298571b47b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.917382 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lq7f"] Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.968070 4837 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-14T13:03:30.38673886Z","Handler":null,"Name":""} Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.970960 4837 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.970986 4837 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 14 13:03:30 crc kubenswrapper[4837]: I1014 13:03:30.995771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.025758 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.097080 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.100928 4837 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.100968 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.153233 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqs75\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.211586 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.361430 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:31 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:31 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:31 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.361732 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.410129 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqs75"] Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.428604 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krz7p"] Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.429521 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.431071 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.437239 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krz7p"] Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.500539 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-utilities\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.500619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-catalog-content\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.500740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2hjx\" (UniqueName: \"kubernetes.io/projected/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-kube-api-access-w2hjx\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.602258 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2hjx\" (UniqueName: \"kubernetes.io/projected/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-kube-api-access-w2hjx\") pod \"redhat-marketplace-krz7p\" (UID: 
\"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.602326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-utilities\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.602366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-catalog-content\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.603019 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-catalog-content\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.603361 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-utilities\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.632233 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2hjx\" (UniqueName: \"kubernetes.io/projected/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-kube-api-access-w2hjx\") pod \"redhat-marketplace-krz7p\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " 
pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.797076 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.839749 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7h7f5"] Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.841655 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.853325 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7h7f5"] Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.878024 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"63871d231da6b4324963878d180336a36136252f4db8296061ae6306b927c2a8"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.878077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9914385c93dddd4cf19d6f8edb3dc2754ecd7a4dbde267301d01d9ccb25278c4"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.881759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a6711f3c26f0051fff05c1b05abbb3ad6982927243fd4abec8ad7acf53b608c4"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.882877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 
14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.884598 4837 generic.go:334] "Generic (PLEG): container finished" podID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerID="e2af15f81c3483493f539f5dd73c5b4e0be3b887917e590ab777bd04a571cc79" exitCode=0 Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.884655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh2lz" event={"ID":"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4","Type":"ContainerDied","Data":"e2af15f81c3483493f539f5dd73c5b4e0be3b887917e590ab777bd04a571cc79"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.888969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" event={"ID":"14165edd-b69a-4886-8405-09298571b47b","Type":"ContainerStarted","Data":"b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.889004 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" event={"ID":"14165edd-b69a-4886-8405-09298571b47b","Type":"ContainerStarted","Data":"9ce717d5f660a9b662fc6a8bac7451a44b82c6c2b93de616af1d2118d51d2b64"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.889513 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.921943 4837 generic.go:334] "Generic (PLEG): container finished" podID="f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" containerID="c0160032115176d3853cc8ec051c96849cb3424aee5ed74716737b2714a99d57" exitCode=0 Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.922009 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" 
event={"ID":"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5","Type":"ContainerDied","Data":"c0160032115176d3853cc8ec051c96849cb3424aee5ed74716737b2714a99d57"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.924537 4837 generic.go:334] "Generic (PLEG): container finished" podID="eb913442-9b33-4a88-a111-164879c37512" containerID="0866912271ca49bec8d452b3ab4ba5af143d4c35befadeab968d91ef36f158d3" exitCode=0 Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.925500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerDied","Data":"0866912271ca49bec8d452b3ab4ba5af143d4c35befadeab968d91ef36f158d3"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.925521 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerStarted","Data":"fee282758787035287c5797b676a1d10b5d0f9ab9ad37ee15b88d485257bd1c7"} Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.935081 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" podStartSLOduration=128.935070731 podStartE2EDuration="2m8.935070731s" podCreationTimestamp="2025-10-14 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:31.934398124 +0000 UTC m=+149.851397937" watchObservedRunningTime="2025-10-14 13:03:31.935070731 +0000 UTC m=+149.852070534" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.946460 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:31 crc kubenswrapper[4837]: I1014 13:03:31.962565 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-k88lw" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.012795 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-catalog-content\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.013022 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-utilities\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.013063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnxgc\" (UniqueName: \"kubernetes.io/projected/351b9e6b-6146-435e-89db-484d35087b98-kube-api-access-rnxgc\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.115947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-utilities\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.116228 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnxgc\" (UniqueName: \"kubernetes.io/projected/351b9e6b-6146-435e-89db-484d35087b98-kube-api-access-rnxgc\") pod \"redhat-marketplace-7h7f5\" (UID: 
\"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.116388 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-catalog-content\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.117020 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-utilities\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.117755 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-catalog-content\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.165310 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krz7p"] Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.173679 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnxgc\" (UniqueName: \"kubernetes.io/projected/351b9e6b-6146-435e-89db-484d35087b98-kube-api-access-rnxgc\") pod \"redhat-marketplace-7h7f5\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.173929 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.359440 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:32 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:32 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:32 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.359497 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.404139 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7h7f5"] Oct 14 13:03:32 crc kubenswrapper[4837]: W1014 13:03:32.454556 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351b9e6b_6146_435e_89db_484d35087b98.slice/crio-a752ac56779f1c14b420ad1f29ac371b2a8b36f681739531c86a42f4660b56c3 WatchSource:0}: Error finding container a752ac56779f1c14b420ad1f29ac371b2a8b36f681739531c86a42f4660b56c3: Status 404 returned error can't find the container with id a752ac56779f1c14b420ad1f29ac371b2a8b36f681739531c86a42f4660b56c3 Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.630193 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f27qg"] Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.631239 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.633879 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.642227 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f27qg"] Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.731607 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-catalog-content\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.731655 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsmg8\" (UniqueName: \"kubernetes.io/projected/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-kube-api-access-vsmg8\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.731815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-utilities\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.827582 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.834423 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-catalog-content\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.834483 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsmg8\" (UniqueName: \"kubernetes.io/projected/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-kube-api-access-vsmg8\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.834518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-utilities\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.835263 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-utilities\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.835366 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-catalog-content\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.863463 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vsmg8\" (UniqueName: \"kubernetes.io/projected/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-kube-api-access-vsmg8\") pod \"redhat-operators-f27qg\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.932716 4837 generic.go:334] "Generic (PLEG): container finished" podID="351b9e6b-6146-435e-89db-484d35087b98" containerID="1b40820fbddabc6a0c88b20fb330ffd83f63396e8825f78cda6fc5d4f49fdb82" exitCode=0 Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.932770 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7h7f5" event={"ID":"351b9e6b-6146-435e-89db-484d35087b98","Type":"ContainerDied","Data":"1b40820fbddabc6a0c88b20fb330ffd83f63396e8825f78cda6fc5d4f49fdb82"} Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.933115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7h7f5" event={"ID":"351b9e6b-6146-435e-89db-484d35087b98","Type":"ContainerStarted","Data":"a752ac56779f1c14b420ad1f29ac371b2a8b36f681739531c86a42f4660b56c3"} Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.934828 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerID="e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5" exitCode=0 Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.934926 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krz7p" event={"ID":"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81","Type":"ContainerDied","Data":"e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5"} Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.934978 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krz7p" 
event={"ID":"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81","Type":"ContainerStarted","Data":"dd0934477a5b0ad67d23adcab9efc4069d4b6f9d4766c4d4280b1e0cc28a2fa7"} Oct 14 13:03:32 crc kubenswrapper[4837]: I1014 13:03:32.972722 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.031364 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qh9xr"] Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.033085 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.052762 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qh9xr"] Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.071845 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.072488 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.075318 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.075416 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.103336 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.147994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8h5g\" (UniqueName: \"kubernetes.io/projected/73944cee-f5d1-491f-bfb5-5663eae0c27b-kube-api-access-b8h5g\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.148038 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-catalog-content\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.148076 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-utilities\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.249455 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-utilities\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.249552 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd83f994-c75d-4e67-a785-b248a51e8986-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.249582 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd83f994-c75d-4e67-a785-b248a51e8986-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.249606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8h5g\" (UniqueName: \"kubernetes.io/projected/73944cee-f5d1-491f-bfb5-5663eae0c27b-kube-api-access-b8h5g\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.249622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-catalog-content\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.250079 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-catalog-content\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.250298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-utilities\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.267908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8h5g\" (UniqueName: \"kubernetes.io/projected/73944cee-f5d1-491f-bfb5-5663eae0c27b-kube-api-access-b8h5g\") pod \"redhat-operators-qh9xr\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.280147 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f27qg"] Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.282374 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:33 crc kubenswrapper[4837]: W1014 13:03:33.333503 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod020802bf_a0e7_44c0_b0d0_0d1f7d66eb35.slice/crio-3b61c56401fcd63334adfd9fa78dd6acf25b953ea4a3524b7df9562b9b7e1d7c WatchSource:0}: Error finding container 3b61c56401fcd63334adfd9fa78dd6acf25b953ea4a3524b7df9562b9b7e1d7c: Status 404 returned error can't find the container with id 3b61c56401fcd63334adfd9fa78dd6acf25b953ea4a3524b7df9562b9b7e1d7c Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.350445 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd83f994-c75d-4e67-a785-b248a51e8986-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.350488 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd83f994-c75d-4e67-a785-b248a51e8986-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.350748 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd83f994-c75d-4e67-a785-b248a51e8986-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.358611 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:33 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:33 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:33 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.358644 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.365449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd83f994-c75d-4e67-a785-b248a51e8986-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.393425 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.404844 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.453869 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tzgh\" (UniqueName: \"kubernetes.io/projected/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-kube-api-access-2tzgh\") pod \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.453938 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-secret-volume\") pod \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.454033 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-config-volume\") pod \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\" (UID: \"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5\") " Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.455336 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" (UID: "f71c72b3-1e65-4ba0-b0c9-e1faaed535a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.458103 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-kube-api-access-2tzgh" (OuterVolumeSpecName: "kube-api-access-2tzgh") pod "f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" (UID: "f71c72b3-1e65-4ba0-b0c9-e1faaed535a5"). 
InnerVolumeSpecName "kube-api-access-2tzgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.458470 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" (UID: "f71c72b3-1e65-4ba0-b0c9-e1faaed535a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.555947 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tzgh\" (UniqueName: \"kubernetes.io/projected/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-kube-api-access-2tzgh\") on node \"crc\" DevicePath \"\"" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.555970 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.555978 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.650440 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qh9xr"] Oct 14 13:03:33 crc kubenswrapper[4837]: W1014 13:03:33.667600 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73944cee_f5d1_491f_bfb5_5663eae0c27b.slice/crio-f8f0bfb189477fdee2f6ecd4310e8f35747c51fb454de1c216f95701c80756e3 WatchSource:0}: Error finding container f8f0bfb189477fdee2f6ecd4310e8f35747c51fb454de1c216f95701c80756e3: Status 404 returned error can't find the container with id 
f8f0bfb189477fdee2f6ecd4310e8f35747c51fb454de1c216f95701c80756e3 Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.691800 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-kjs2b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.691887 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kjs2b" podUID="1ad70ef2-45cd-4139-a60e-0bda62597cb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.691812 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-kjs2b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.692077 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-kjs2b" podUID="1ad70ef2-45cd-4139-a60e-0bda62597cb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 14 13:03:33 crc kubenswrapper[4837]: I1014 13:03:33.937516 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 14 13:03:33 crc kubenswrapper[4837]: E1014 13:03:33.965462 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73944cee_f5d1_491f_bfb5_5663eae0c27b.slice/crio-a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8.scope\": RecentStats: unable to find data in memory cache]" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.022202 4837 generic.go:334] "Generic (PLEG): container finished" podID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerID="a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8" exitCode=0 Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.022288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qh9xr" event={"ID":"73944cee-f5d1-491f-bfb5-5663eae0c27b","Type":"ContainerDied","Data":"a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8"} Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.022316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qh9xr" event={"ID":"73944cee-f5d1-491f-bfb5-5663eae0c27b","Type":"ContainerStarted","Data":"f8f0bfb189477fdee2f6ecd4310e8f35747c51fb454de1c216f95701c80756e3"} Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.022372 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.022920 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.024622 4837 patch_prober.go:28] interesting pod/console-f9d7485db-6vd5d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.024669 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6vd5d" 
podUID="fb47e83f-903a-4420-9741-645bbbdf63c4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.025131 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" event={"ID":"f71c72b3-1e65-4ba0-b0c9-e1faaed535a5","Type":"ContainerDied","Data":"7258dee44d3606211a2853abcec1932ac258021a166fc4a97ea972e9542ae34e"} Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.025185 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7258dee44d3606211a2853abcec1932ac258021a166fc4a97ea972e9542ae34e" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.025258 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.028381 4837 generic.go:334] "Generic (PLEG): container finished" podID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerID="c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e" exitCode=0 Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.028516 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerDied","Data":"c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e"} Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.028549 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerStarted","Data":"3b61c56401fcd63334adfd9fa78dd6acf25b953ea4a3524b7df9562b9b7e1d7c"} Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.356015 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.359918 4837 patch_prober.go:28] interesting pod/router-default-5444994796-94j6s container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:03:34 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Oct 14 13:03:34 crc kubenswrapper[4837]: [+]process-running ok Oct 14 13:03:34 crc kubenswrapper[4837]: healthz check failed Oct 14 13:03:34 crc kubenswrapper[4837]: I1014 13:03:34.359979 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-94j6s" podUID="341ec536-bf38-4225-a0bc-da7f4837cdbc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:03:35 crc kubenswrapper[4837]: I1014 13:03:35.038106 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd83f994-c75d-4e67-a785-b248a51e8986","Type":"ContainerStarted","Data":"9ad7cf86d578c6186095077c74cab463670467f76ef05366f439f24ee4366243"} Oct 14 13:03:35 crc kubenswrapper[4837]: I1014 13:03:35.038167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd83f994-c75d-4e67-a785-b248a51e8986","Type":"ContainerStarted","Data":"29f4bde05bbc64c7b8fe4686c160b3c4519cccc6b19c77f5e6c85d5a284d9ffb"} Oct 14 13:03:35 crc kubenswrapper[4837]: I1014 13:03:35.056518 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.056500545 podStartE2EDuration="2.056500545s" podCreationTimestamp="2025-10-14 13:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-14 13:03:35.052060145 +0000 UTC m=+152.969059958" watchObservedRunningTime="2025-10-14 13:03:35.056500545 +0000 UTC m=+152.973500358" Oct 14 13:03:35 crc kubenswrapper[4837]: I1014 13:03:35.358754 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:35 crc kubenswrapper[4837]: I1014 13:03:35.361795 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-94j6s" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.087393 4837 generic.go:334] "Generic (PLEG): container finished" podID="bd83f994-c75d-4e67-a785-b248a51e8986" containerID="9ad7cf86d578c6186095077c74cab463670467f76ef05366f439f24ee4366243" exitCode=0 Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.088177 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd83f994-c75d-4e67-a785-b248a51e8986","Type":"ContainerDied","Data":"9ad7cf86d578c6186095077c74cab463670467f76ef05366f439f24ee4366243"} Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.412091 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 14 13:03:36 crc kubenswrapper[4837]: E1014 13:03:36.412325 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" containerName="collect-profiles" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.412341 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" containerName="collect-profiles" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.412478 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" containerName="collect-profiles" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.412836 4837 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.416607 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.416751 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.435360 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.512632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fceb6b-af85-4aeb-9321-389fe647b886-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.512714 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fceb6b-af85-4aeb-9321-389fe647b886-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.614387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fceb6b-af85-4aeb-9321-389fe647b886-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.614501 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fceb6b-af85-4aeb-9321-389fe647b886-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.614515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fceb6b-af85-4aeb-9321-389fe647b886-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.645597 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fceb6b-af85-4aeb-9321-389fe647b886-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:36 crc kubenswrapper[4837]: I1014 13:03:36.761395 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:03:38 crc kubenswrapper[4837]: I1014 13:03:38.391985 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:03:39 crc kubenswrapper[4837]: I1014 13:03:39.487549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tl2qn" Oct 14 13:03:41 crc kubenswrapper[4837]: I1014 13:03:41.140320 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:03:41 crc kubenswrapper[4837]: I1014 13:03:41.140595 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.633829 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.753244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd83f994-c75d-4e67-a785-b248a51e8986-kube-api-access\") pod \"bd83f994-c75d-4e67-a785-b248a51e8986\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.753403 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd83f994-c75d-4e67-a785-b248a51e8986-kubelet-dir\") pod \"bd83f994-c75d-4e67-a785-b248a51e8986\" (UID: \"bd83f994-c75d-4e67-a785-b248a51e8986\") " Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.753504 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd83f994-c75d-4e67-a785-b248a51e8986-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd83f994-c75d-4e67-a785-b248a51e8986" (UID: "bd83f994-c75d-4e67-a785-b248a51e8986"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.753562 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd83f994-c75d-4e67-a785-b248a51e8986-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.762904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd83f994-c75d-4e67-a785-b248a51e8986-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd83f994-c75d-4e67-a785-b248a51e8986" (UID: "bd83f994-c75d-4e67-a785-b248a51e8986"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:03:42 crc kubenswrapper[4837]: I1014 13:03:42.854578 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd83f994-c75d-4e67-a785-b248a51e8986-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 14 13:03:43 crc kubenswrapper[4837]: I1014 13:03:43.146001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd83f994-c75d-4e67-a785-b248a51e8986","Type":"ContainerDied","Data":"29f4bde05bbc64c7b8fe4686c160b3c4519cccc6b19c77f5e6c85d5a284d9ffb"} Oct 14 13:03:43 crc kubenswrapper[4837]: I1014 13:03:43.146044 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 14 13:03:43 crc kubenswrapper[4837]: I1014 13:03:43.146109 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f4bde05bbc64c7b8fe4686c160b3c4519cccc6b19c77f5e6c85d5a284d9ffb" Oct 14 13:03:43 crc kubenswrapper[4837]: I1014 13:03:43.697516 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-kjs2b" Oct 14 13:03:44 crc kubenswrapper[4837]: I1014 13:03:44.046975 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:44 crc kubenswrapper[4837]: I1014 13:03:44.053726 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:03:44 crc kubenswrapper[4837]: I1014 13:03:44.615681 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 14 13:03:46 crc kubenswrapper[4837]: I1014 13:03:46.607433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:46 crc kubenswrapper[4837]: I1014 13:03:46.615800 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c934a24-9e12-46eb-851e-1a6925dc8909-metrics-certs\") pod \"network-metrics-daemon-pcpcf\" (UID: \"7c934a24-9e12-46eb-851e-1a6925dc8909\") " pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:46 crc kubenswrapper[4837]: I1014 13:03:46.814042 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pcpcf" Oct 14 13:03:48 crc kubenswrapper[4837]: W1014 13:03:48.929483 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod83fceb6b_af85_4aeb_9321_389fe647b886.slice/crio-7ec61e297826ef4c26ecd254480b2270fbd30097801b6fe1f73667c518301bdd WatchSource:0}: Error finding container 7ec61e297826ef4c26ecd254480b2270fbd30097801b6fe1f73667c518301bdd: Status 404 returned error can't find the container with id 7ec61e297826ef4c26ecd254480b2270fbd30097801b6fe1f73667c518301bdd Oct 14 13:03:49 crc kubenswrapper[4837]: I1014 13:03:49.181763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83fceb6b-af85-4aeb-9321-389fe647b886","Type":"ContainerStarted","Data":"7ec61e297826ef4c26ecd254480b2270fbd30097801b6fe1f73667c518301bdd"} Oct 14 13:03:51 crc kubenswrapper[4837]: I1014 13:03:51.216749 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:03:54 crc kubenswrapper[4837]: E1014 13:03:54.979355 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 14 13:03:54 crc kubenswrapper[4837]: E1014 13:03:54.979768 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvdpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2lq7f_openshift-marketplace(eb913442-9b33-4a88-a111-164879c37512): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Oct 14 13:03:54 crc kubenswrapper[4837]: E1014 13:03:54.981245 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2lq7f" podUID="eb913442-9b33-4a88-a111-164879c37512" Oct 14 13:03:55 crc kubenswrapper[4837]: E1014 13:03:55.565740 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2lq7f" podUID="eb913442-9b33-4a88-a111-164879c37512" Oct 14 13:03:55 crc kubenswrapper[4837]: E1014 13:03:55.669494 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 14 13:03:55 crc kubenswrapper[4837]: E1014 13:03:55.669661 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8dqtm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-n8266_openshift-marketplace(49e47458-6044-4966-a0e5-3a8e243414f8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 14 13:03:55 crc kubenswrapper[4837]: E1014 13:03:55.671217 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-n8266" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" Oct 14 13:03:58 crc 
kubenswrapper[4837]: E1014 13:03:58.348454 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-n8266" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" Oct 14 13:03:58 crc kubenswrapper[4837]: I1014 13:03:58.562141 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pcpcf"] Oct 14 13:03:58 crc kubenswrapper[4837]: W1014 13:03:58.568599 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c934a24_9e12_46eb_851e_1a6925dc8909.slice/crio-68a80ceffc968c1d6af0c8151149ee5eb5c60a6ebdb42a1f4b830d31414b5bb8 WatchSource:0}: Error finding container 68a80ceffc968c1d6af0c8151149ee5eb5c60a6ebdb42a1f4b830d31414b5bb8: Status 404 returned error can't find the container with id 68a80ceffc968c1d6af0c8151149ee5eb5c60a6ebdb42a1f4b830d31414b5bb8 Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.242493 4837 generic.go:334] "Generic (PLEG): container finished" podID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerID="72f4e4eb369c2336ef0fb77f8535e038009a7fbe7d5e582eeca572ddf17c47b7" exitCode=0 Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.242590 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh2lz" event={"ID":"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4","Type":"ContainerDied","Data":"72f4e4eb369c2336ef0fb77f8535e038009a7fbe7d5e582eeca572ddf17c47b7"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.247643 4837 generic.go:334] "Generic (PLEG): container finished" podID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerID="5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911" exitCode=0 Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.247714 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qh9xr" event={"ID":"73944cee-f5d1-491f-bfb5-5663eae0c27b","Type":"ContainerDied","Data":"5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.249334 4837 generic.go:334] "Generic (PLEG): container finished" podID="351b9e6b-6146-435e-89db-484d35087b98" containerID="0490e48baabd7ead78ed6e5573f13566e99737e99d051b484b84928ecd1ddff3" exitCode=0 Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.249420 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7h7f5" event={"ID":"351b9e6b-6146-435e-89db-484d35087b98","Type":"ContainerDied","Data":"0490e48baabd7ead78ed6e5573f13566e99737e99d051b484b84928ecd1ddff3"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.252355 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerID="dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9" exitCode=0 Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.252413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krz7p" event={"ID":"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81","Type":"ContainerDied","Data":"dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.254352 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83fceb6b-af85-4aeb-9321-389fe647b886","Type":"ContainerStarted","Data":"2cb5a287b9648e00928e0f7a3fd379cc66a8a7a9e3c063d192aa5ece6436ab35"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.256567 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" 
event={"ID":"7c934a24-9e12-46eb-851e-1a6925dc8909","Type":"ContainerStarted","Data":"f1aa18876aacb079ca51eb8ae259fb075631d2f4e21da07a1c27295a795d4393"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.256586 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" event={"ID":"7c934a24-9e12-46eb-851e-1a6925dc8909","Type":"ContainerStarted","Data":"3ab3bfaefd128273b9797aa1be46f465091af54127ff6d8e6ae4079533c1ac18"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.256595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pcpcf" event={"ID":"7c934a24-9e12-46eb-851e-1a6925dc8909","Type":"ContainerStarted","Data":"68a80ceffc968c1d6af0c8151149ee5eb5c60a6ebdb42a1f4b830d31414b5bb8"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.262872 4837 generic.go:334] "Generic (PLEG): container finished" podID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerID="e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b" exitCode=0 Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.262965 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p5s4" event={"ID":"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9","Type":"ContainerDied","Data":"e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.268276 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerStarted","Data":"eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f"} Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.318605 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pcpcf" podStartSLOduration=155.318580094 podStartE2EDuration="2m35.318580094s" podCreationTimestamp="2025-10-14 13:01:24 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:59.316265732 +0000 UTC m=+177.233265555" watchObservedRunningTime="2025-10-14 13:03:59.318580094 +0000 UTC m=+177.235579907" Oct 14 13:03:59 crc kubenswrapper[4837]: I1014 13:03:59.423853 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=23.423752123 podStartE2EDuration="23.423752123s" podCreationTimestamp="2025-10-14 13:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:03:59.422390416 +0000 UTC m=+177.339390229" watchObservedRunningTime="2025-10-14 13:03:59.423752123 +0000 UTC m=+177.340751936" Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 13:04:00.054129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 13:04:00.278367 4837 generic.go:334] "Generic (PLEG): container finished" podID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerID="eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f" exitCode=0 Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 13:04:00.278426 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerDied","Data":"eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f"} Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 13:04:00.281735 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qh9xr" event={"ID":"73944cee-f5d1-491f-bfb5-5663eae0c27b","Type":"ContainerStarted","Data":"d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa"} Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 
13:04:00.285058 4837 generic.go:334] "Generic (PLEG): container finished" podID="83fceb6b-af85-4aeb-9321-389fe647b886" containerID="2cb5a287b9648e00928e0f7a3fd379cc66a8a7a9e3c063d192aa5ece6436ab35" exitCode=0 Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 13:04:00.285201 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83fceb6b-af85-4aeb-9321-389fe647b886","Type":"ContainerDied","Data":"2cb5a287b9648e00928e0f7a3fd379cc66a8a7a9e3c063d192aa5ece6436ab35"} Oct 14 13:04:00 crc kubenswrapper[4837]: I1014 13:04:00.302804 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qh9xr" podStartSLOduration=1.6562779650000001 podStartE2EDuration="27.302777254s" podCreationTimestamp="2025-10-14 13:03:33 +0000 UTC" firstStartedPulling="2025-10-14 13:03:34.046368096 +0000 UTC m=+151.963367909" lastFinishedPulling="2025-10-14 13:03:59.692867345 +0000 UTC m=+177.609867198" observedRunningTime="2025-10-14 13:04:00.299909717 +0000 UTC m=+178.216909550" watchObservedRunningTime="2025-10-14 13:04:00.302777254 +0000 UTC m=+178.219777077" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.292777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh2lz" event={"ID":"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4","Type":"ContainerStarted","Data":"d22573fe5136a8e42a600638cc85de9e74b26861c1ca735d6dc5498171548621"} Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.295411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7h7f5" event={"ID":"351b9e6b-6146-435e-89db-484d35087b98","Type":"ContainerStarted","Data":"fbbe9069bb422f1a0dbe396367b1c5e5ee3044103e3c2618ab73173ab9e67f56"} Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.297869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krz7p" 
event={"ID":"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81","Type":"ContainerStarted","Data":"b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc"} Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.299741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p5s4" event={"ID":"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9","Type":"ContainerStarted","Data":"39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972"} Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.303414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerStarted","Data":"33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d"} Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.316185 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vh2lz" podStartSLOduration=3.6569972120000003 podStartE2EDuration="32.31617054s" podCreationTimestamp="2025-10-14 13:03:29 +0000 UTC" firstStartedPulling="2025-10-14 13:03:31.887414515 +0000 UTC m=+149.804414338" lastFinishedPulling="2025-10-14 13:04:00.546587823 +0000 UTC m=+178.463587666" observedRunningTime="2025-10-14 13:04:01.314315201 +0000 UTC m=+179.231315014" watchObservedRunningTime="2025-10-14 13:04:01.31617054 +0000 UTC m=+179.233170353" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.335083 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f27qg" podStartSLOduration=3.353663903 podStartE2EDuration="29.33505317s" podCreationTimestamp="2025-10-14 13:03:32 +0000 UTC" firstStartedPulling="2025-10-14 13:03:34.046688624 +0000 UTC m=+151.963688437" lastFinishedPulling="2025-10-14 13:04:00.028077881 +0000 UTC m=+177.945077704" observedRunningTime="2025-10-14 13:04:01.335001318 +0000 UTC m=+179.252001131" 
watchObservedRunningTime="2025-10-14 13:04:01.33505317 +0000 UTC m=+179.252052983" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.352680 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krz7p" podStartSLOduration=3.321650054 podStartE2EDuration="30.352664895s" podCreationTimestamp="2025-10-14 13:03:31 +0000 UTC" firstStartedPulling="2025-10-14 13:03:32.943526135 +0000 UTC m=+150.860525948" lastFinishedPulling="2025-10-14 13:03:59.974540986 +0000 UTC m=+177.891540789" observedRunningTime="2025-10-14 13:04:01.35173889 +0000 UTC m=+179.268738723" watchObservedRunningTime="2025-10-14 13:04:01.352664895 +0000 UTC m=+179.269664708" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.373250 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7h7f5" podStartSLOduration=3.315208471 podStartE2EDuration="30.373234641s" podCreationTimestamp="2025-10-14 13:03:31 +0000 UTC" firstStartedPulling="2025-10-14 13:03:32.943567686 +0000 UTC m=+150.860567499" lastFinishedPulling="2025-10-14 13:04:00.001593866 +0000 UTC m=+177.918593669" observedRunningTime="2025-10-14 13:04:01.370967639 +0000 UTC m=+179.287967452" watchObservedRunningTime="2025-10-14 13:04:01.373234641 +0000 UTC m=+179.290234454" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.632274 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.647303 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p5s4" podStartSLOduration=3.127824473 podStartE2EDuration="32.647285156s" podCreationTimestamp="2025-10-14 13:03:29 +0000 UTC" firstStartedPulling="2025-10-14 13:03:30.843021592 +0000 UTC m=+148.760021405" lastFinishedPulling="2025-10-14 13:04:00.362482265 +0000 UTC m=+178.279482088" observedRunningTime="2025-10-14 13:04:01.397837404 +0000 UTC m=+179.314837217" watchObservedRunningTime="2025-10-14 13:04:01.647285156 +0000 UTC m=+179.564284969" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.649023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fceb6b-af85-4aeb-9321-389fe647b886-kubelet-dir\") pod \"83fceb6b-af85-4aeb-9321-389fe647b886\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.649110 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83fceb6b-af85-4aeb-9321-389fe647b886-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "83fceb6b-af85-4aeb-9321-389fe647b886" (UID: "83fceb6b-af85-4aeb-9321-389fe647b886"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.649193 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fceb6b-af85-4aeb-9321-389fe647b886-kube-api-access\") pod \"83fceb6b-af85-4aeb-9321-389fe647b886\" (UID: \"83fceb6b-af85-4aeb-9321-389fe647b886\") " Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.649441 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fceb6b-af85-4aeb-9321-389fe647b886-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.657815 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83fceb6b-af85-4aeb-9321-389fe647b886-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "83fceb6b-af85-4aeb-9321-389fe647b886" (UID: "83fceb6b-af85-4aeb-9321-389fe647b886"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.750581 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fceb6b-af85-4aeb-9321-389fe647b886-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.797748 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.797807 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:04:01 crc kubenswrapper[4837]: I1014 13:04:01.958474 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.174566 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.174616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.216431 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.309047 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83fceb6b-af85-4aeb-9321-389fe647b886","Type":"ContainerDied","Data":"7ec61e297826ef4c26ecd254480b2270fbd30097801b6fe1f73667c518301bdd"} Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.309076 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.309098 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ec61e297826ef4c26ecd254480b2270fbd30097801b6fe1f73667c518301bdd" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.973028 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:04:02 crc kubenswrapper[4837]: I1014 13:04:02.973468 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:04:03 crc kubenswrapper[4837]: I1014 13:04:03.394355 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:04:03 crc kubenswrapper[4837]: I1014 13:04:03.395373 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:04:04 crc kubenswrapper[4837]: I1014 13:04:04.055389 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f27qg" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="registry-server" probeResult="failure" output=< Oct 14 13:04:04 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Oct 14 13:04:04 crc kubenswrapper[4837]: > Oct 14 13:04:04 crc kubenswrapper[4837]: I1014 13:04:04.440436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p4hjj" Oct 14 13:04:04 crc kubenswrapper[4837]: I1014 13:04:04.450489 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qh9xr" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="registry-server" probeResult="failure" output=< Oct 14 13:04:04 crc kubenswrapper[4837]: 
timeout: failed to connect service ":50051" within 1s Oct 14 13:04:04 crc kubenswrapper[4837]: > Oct 14 13:04:09 crc kubenswrapper[4837]: I1014 13:04:09.350004 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerStarted","Data":"ccc385c2d1cc65f766d3d40ab5801f954329b67a51a2a92be263a66d362600aa"} Oct 14 13:04:09 crc kubenswrapper[4837]: I1014 13:04:09.968785 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:04:09 crc kubenswrapper[4837]: I1014 13:04:09.968849 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 13:04:10.024026 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 13:04:10.359271 4837 generic.go:334] "Generic (PLEG): container finished" podID="eb913442-9b33-4a88-a111-164879c37512" containerID="ccc385c2d1cc65f766d3d40ab5801f954329b67a51a2a92be263a66d362600aa" exitCode=0 Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 13:04:10.359332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerDied","Data":"ccc385c2d1cc65f766d3d40ab5801f954329b67a51a2a92be263a66d362600aa"} Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 13:04:10.434933 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 13:04:10.515767 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 
13:04:10.515844 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:04:10 crc kubenswrapper[4837]: I1014 13:04:10.580399 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:04:11 crc kubenswrapper[4837]: I1014 13:04:11.140531 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:04:11 crc kubenswrapper[4837]: I1014 13:04:11.141195 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:04:11 crc kubenswrapper[4837]: I1014 13:04:11.464091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:04:11 crc kubenswrapper[4837]: I1014 13:04:11.861907 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:04:12 crc kubenswrapper[4837]: I1014 13:04:12.243803 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:04:12 crc kubenswrapper[4837]: I1014 13:04:12.893749 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vh2lz"] Oct 14 13:04:13 crc kubenswrapper[4837]: I1014 13:04:13.033914 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:04:13 crc kubenswrapper[4837]: I1014 13:04:13.109294 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:04:13 crc kubenswrapper[4837]: I1014 13:04:13.377063 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vh2lz" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="registry-server" containerID="cri-o://d22573fe5136a8e42a600638cc85de9e74b26861c1ca735d6dc5498171548621" gracePeriod=2 Oct 14 13:04:13 crc kubenswrapper[4837]: I1014 13:04:13.450072 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:04:13 crc kubenswrapper[4837]: I1014 13:04:13.508938 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.283722 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7h7f5"] Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.285076 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7h7f5" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="registry-server" containerID="cri-o://fbbe9069bb422f1a0dbe396367b1c5e5ee3044103e3c2618ab73173ab9e67f56" gracePeriod=2 Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.385275 4837 generic.go:334] "Generic (PLEG): container finished" podID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerID="d22573fe5136a8e42a600638cc85de9e74b26861c1ca735d6dc5498171548621" exitCode=0 Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.385353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh2lz" 
event={"ID":"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4","Type":"ContainerDied","Data":"d22573fe5136a8e42a600638cc85de9e74b26861c1ca735d6dc5498171548621"} Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.388472 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerStarted","Data":"1025f4d0102daf1c6c0ba3652e718896497c62c653d0b77f542a7bae643b35e8"} Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.412295 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2lq7f" podStartSLOduration=3.078897598 podStartE2EDuration="44.412275532s" podCreationTimestamp="2025-10-14 13:03:30 +0000 UTC" firstStartedPulling="2025-10-14 13:03:31.92648154 +0000 UTC m=+149.843481353" lastFinishedPulling="2025-10-14 13:04:13.259859474 +0000 UTC m=+191.176859287" observedRunningTime="2025-10-14 13:04:14.409768164 +0000 UTC m=+192.326768007" watchObservedRunningTime="2025-10-14 13:04:14.412275532 +0000 UTC m=+192.329275345" Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.806854 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.934370 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-catalog-content\") pod \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.934445 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhxw\" (UniqueName: \"kubernetes.io/projected/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-kube-api-access-kwhxw\") pod \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.934479 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-utilities\") pod \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\" (UID: \"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4\") " Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.935634 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-utilities" (OuterVolumeSpecName: "utilities") pod "a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" (UID: "a5e629bc-e6dd-4cb8-9a73-b3c189d560e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:14 crc kubenswrapper[4837]: I1014 13:04:14.943553 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-kube-api-access-kwhxw" (OuterVolumeSpecName: "kube-api-access-kwhxw") pod "a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" (UID: "a5e629bc-e6dd-4cb8-9a73-b3c189d560e4"). InnerVolumeSpecName "kube-api-access-kwhxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.001135 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" (UID: "a5e629bc-e6dd-4cb8-9a73-b3c189d560e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.035845 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.035878 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.035894 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhxw\" (UniqueName: \"kubernetes.io/projected/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4-kube-api-access-kwhxw\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.408934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vh2lz" event={"ID":"a5e629bc-e6dd-4cb8-9a73-b3c189d560e4","Type":"ContainerDied","Data":"35b0428104c0bbfccae6379c066ce477a4ef32d9e5a5953a0c7555453bf0ab15"} Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.408999 4837 scope.go:117] "RemoveContainer" containerID="d22573fe5136a8e42a600638cc85de9e74b26861c1ca735d6dc5498171548621" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.409143 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vh2lz" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.416042 4837 generic.go:334] "Generic (PLEG): container finished" podID="351b9e6b-6146-435e-89db-484d35087b98" containerID="fbbe9069bb422f1a0dbe396367b1c5e5ee3044103e3c2618ab73173ab9e67f56" exitCode=0 Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.416119 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7h7f5" event={"ID":"351b9e6b-6146-435e-89db-484d35087b98","Type":"ContainerDied","Data":"fbbe9069bb422f1a0dbe396367b1c5e5ee3044103e3c2618ab73173ab9e67f56"} Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.424523 4837 scope.go:117] "RemoveContainer" containerID="72f4e4eb369c2336ef0fb77f8535e038009a7fbe7d5e582eeca572ddf17c47b7" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.449554 4837 scope.go:117] "RemoveContainer" containerID="e2af15f81c3483493f539f5dd73c5b4e0be3b887917e590ab777bd04a571cc79" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.457288 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vh2lz"] Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.472904 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vh2lz"] Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.796000 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.851124 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-catalog-content\") pod \"351b9e6b-6146-435e-89db-484d35087b98\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.851234 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-utilities\") pod \"351b9e6b-6146-435e-89db-484d35087b98\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.851358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnxgc\" (UniqueName: \"kubernetes.io/projected/351b9e6b-6146-435e-89db-484d35087b98-kube-api-access-rnxgc\") pod \"351b9e6b-6146-435e-89db-484d35087b98\" (UID: \"351b9e6b-6146-435e-89db-484d35087b98\") " Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.851989 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-utilities" (OuterVolumeSpecName: "utilities") pod "351b9e6b-6146-435e-89db-484d35087b98" (UID: "351b9e6b-6146-435e-89db-484d35087b98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.854630 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351b9e6b-6146-435e-89db-484d35087b98-kube-api-access-rnxgc" (OuterVolumeSpecName: "kube-api-access-rnxgc") pod "351b9e6b-6146-435e-89db-484d35087b98" (UID: "351b9e6b-6146-435e-89db-484d35087b98"). InnerVolumeSpecName "kube-api-access-rnxgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.869419 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "351b9e6b-6146-435e-89db-484d35087b98" (UID: "351b9e6b-6146-435e-89db-484d35087b98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.954100 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.954187 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351b9e6b-6146-435e-89db-484d35087b98-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:15 crc kubenswrapper[4837]: I1014 13:04:15.954203 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnxgc\" (UniqueName: \"kubernetes.io/projected/351b9e6b-6146-435e-89db-484d35087b98-kube-api-access-rnxgc\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.430353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerStarted","Data":"b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f"} Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.433932 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7h7f5" event={"ID":"351b9e6b-6146-435e-89db-484d35087b98","Type":"ContainerDied","Data":"a752ac56779f1c14b420ad1f29ac371b2a8b36f681739531c86a42f4660b56c3"} Oct 14 13:04:16 crc kubenswrapper[4837]: 
I1014 13:04:16.433988 4837 scope.go:117] "RemoveContainer" containerID="fbbe9069bb422f1a0dbe396367b1c5e5ee3044103e3c2618ab73173ab9e67f56" Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.434088 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7h7f5" Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.462541 4837 scope.go:117] "RemoveContainer" containerID="0490e48baabd7ead78ed6e5573f13566e99737e99d051b484b84928ecd1ddff3" Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.469601 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7h7f5"] Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.475240 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7h7f5"] Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.479138 4837 scope.go:117] "RemoveContainer" containerID="1b40820fbddabc6a0c88b20fb330ffd83f63396e8825f78cda6fc5d4f49fdb82" Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.684533 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qh9xr"] Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.684783 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qh9xr" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="registry-server" containerID="cri-o://d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa" gracePeriod=2 Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.790377 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351b9e6b-6146-435e-89db-484d35087b98" path="/var/lib/kubelet/pods/351b9e6b-6146-435e-89db-484d35087b98/volumes" Oct 14 13:04:16 crc kubenswrapper[4837]: I1014 13:04:16.791297 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" path="/var/lib/kubelet/pods/a5e629bc-e6dd-4cb8-9a73-b3c189d560e4/volumes" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.042273 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.069203 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-catalog-content\") pod \"73944cee-f5d1-491f-bfb5-5663eae0c27b\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.069310 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-utilities\") pod \"73944cee-f5d1-491f-bfb5-5663eae0c27b\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.069372 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8h5g\" (UniqueName: \"kubernetes.io/projected/73944cee-f5d1-491f-bfb5-5663eae0c27b-kube-api-access-b8h5g\") pod \"73944cee-f5d1-491f-bfb5-5663eae0c27b\" (UID: \"73944cee-f5d1-491f-bfb5-5663eae0c27b\") " Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.071988 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-utilities" (OuterVolumeSpecName: "utilities") pod "73944cee-f5d1-491f-bfb5-5663eae0c27b" (UID: "73944cee-f5d1-491f-bfb5-5663eae0c27b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.077383 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73944cee-f5d1-491f-bfb5-5663eae0c27b-kube-api-access-b8h5g" (OuterVolumeSpecName: "kube-api-access-b8h5g") pod "73944cee-f5d1-491f-bfb5-5663eae0c27b" (UID: "73944cee-f5d1-491f-bfb5-5663eae0c27b"). InnerVolumeSpecName "kube-api-access-b8h5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.155304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73944cee-f5d1-491f-bfb5-5663eae0c27b" (UID: "73944cee-f5d1-491f-bfb5-5663eae0c27b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.172737 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.172771 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8h5g\" (UniqueName: \"kubernetes.io/projected/73944cee-f5d1-491f-bfb5-5663eae0c27b-kube-api-access-b8h5g\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.172783 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73944cee-f5d1-491f-bfb5-5663eae0c27b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.446063 4837 generic.go:334] "Generic (PLEG): container finished" podID="49e47458-6044-4966-a0e5-3a8e243414f8" 
containerID="b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f" exitCode=0 Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.446213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerDied","Data":"b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f"} Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.451232 4837 generic.go:334] "Generic (PLEG): container finished" podID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerID="d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa" exitCode=0 Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.451256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qh9xr" event={"ID":"73944cee-f5d1-491f-bfb5-5663eae0c27b","Type":"ContainerDied","Data":"d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa"} Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.451274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qh9xr" event={"ID":"73944cee-f5d1-491f-bfb5-5663eae0c27b","Type":"ContainerDied","Data":"f8f0bfb189477fdee2f6ecd4310e8f35747c51fb454de1c216f95701c80756e3"} Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.451290 4837 scope.go:117] "RemoveContainer" containerID="d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.451362 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qh9xr" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.492370 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qh9xr"] Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.493373 4837 scope.go:117] "RemoveContainer" containerID="5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.496975 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qh9xr"] Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.528135 4837 scope.go:117] "RemoveContainer" containerID="a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.551501 4837 scope.go:117] "RemoveContainer" containerID="d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa" Oct 14 13:04:17 crc kubenswrapper[4837]: E1014 13:04:17.551938 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa\": container with ID starting with d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa not found: ID does not exist" containerID="d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.551965 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa"} err="failed to get container status \"d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa\": rpc error: code = NotFound desc = could not find container \"d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa\": container with ID starting with d3590ab6bc06750e767ea4b5ffa98e063e6ff7d97b391ed0c4414bcc194c9bfa not found: ID does 
not exist" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.552021 4837 scope.go:117] "RemoveContainer" containerID="5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911" Oct 14 13:04:17 crc kubenswrapper[4837]: E1014 13:04:17.552403 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911\": container with ID starting with 5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911 not found: ID does not exist" containerID="5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.552423 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911"} err="failed to get container status \"5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911\": rpc error: code = NotFound desc = could not find container \"5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911\": container with ID starting with 5e7ec29b8565bbdb23b08eac03bf91cdbf6ab9c611a9c41a9f4e096e840e5911 not found: ID does not exist" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.552435 4837 scope.go:117] "RemoveContainer" containerID="a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8" Oct 14 13:04:17 crc kubenswrapper[4837]: E1014 13:04:17.552650 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8\": container with ID starting with a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8 not found: ID does not exist" containerID="a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8" Oct 14 13:04:17 crc kubenswrapper[4837]: I1014 13:04:17.552672 4837 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8"} err="failed to get container status \"a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8\": rpc error: code = NotFound desc = could not find container \"a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8\": container with ID starting with a3258b13d36285e2c92c4a2cb77f631179abf4cae947b0619f0a3a34fdc1b1e8 not found: ID does not exist" Oct 14 13:04:18 crc kubenswrapper[4837]: I1014 13:04:18.466727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerStarted","Data":"15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16"} Oct 14 13:04:18 crc kubenswrapper[4837]: I1014 13:04:18.790318 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" path="/var/lib/kubelet/pods/73944cee-f5d1-491f-bfb5-5663eae0c27b/volumes" Oct 14 13:04:19 crc kubenswrapper[4837]: I1014 13:04:19.762058 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:04:19 crc kubenswrapper[4837]: I1014 13:04:19.762349 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:04:20 crc kubenswrapper[4837]: I1014 13:04:20.425210 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:04:20 crc kubenswrapper[4837]: I1014 13:04:20.425591 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:04:20 crc kubenswrapper[4837]: I1014 13:04:20.491921 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:04:20 crc kubenswrapper[4837]: I1014 13:04:20.509085 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n8266" podStartSLOduration=4.361887462 podStartE2EDuration="51.509067543s" podCreationTimestamp="2025-10-14 13:03:29 +0000 UTC" firstStartedPulling="2025-10-14 13:03:30.837512593 +0000 UTC m=+148.754512406" lastFinishedPulling="2025-10-14 13:04:17.984692634 +0000 UTC m=+195.901692487" observedRunningTime="2025-10-14 13:04:18.498641143 +0000 UTC m=+196.415640996" watchObservedRunningTime="2025-10-14 13:04:20.509067543 +0000 UTC m=+198.426067366" Oct 14 13:04:20 crc kubenswrapper[4837]: I1014 13:04:20.548908 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:04:20 crc kubenswrapper[4837]: I1014 13:04:20.805071 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-n8266" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="registry-server" probeResult="failure" output=< Oct 14 13:04:20 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Oct 14 13:04:20 crc kubenswrapper[4837]: > Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.093929 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lq7f"] Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.097498 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2lq7f" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="registry-server" containerID="cri-o://1025f4d0102daf1c6c0ba3652e718896497c62c653d0b77f542a7bae643b35e8" gracePeriod=2 Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.497137 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="eb913442-9b33-4a88-a111-164879c37512" containerID="1025f4d0102daf1c6c0ba3652e718896497c62c653d0b77f542a7bae643b35e8" exitCode=0 Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.497373 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerDied","Data":"1025f4d0102daf1c6c0ba3652e718896497c62c653d0b77f542a7bae643b35e8"} Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.497400 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lq7f" event={"ID":"eb913442-9b33-4a88-a111-164879c37512","Type":"ContainerDied","Data":"fee282758787035287c5797b676a1d10b5d0f9ab9ad37ee15b88d485257bd1c7"} Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.497413 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee282758787035287c5797b676a1d10b5d0f9ab9ad37ee15b88d485257bd1c7" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.515515 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.555978 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvdpx\" (UniqueName: \"kubernetes.io/projected/eb913442-9b33-4a88-a111-164879c37512-kube-api-access-lvdpx\") pod \"eb913442-9b33-4a88-a111-164879c37512\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.556015 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-utilities\") pod \"eb913442-9b33-4a88-a111-164879c37512\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.556043 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-catalog-content\") pod \"eb913442-9b33-4a88-a111-164879c37512\" (UID: \"eb913442-9b33-4a88-a111-164879c37512\") " Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.556910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-utilities" (OuterVolumeSpecName: "utilities") pod "eb913442-9b33-4a88-a111-164879c37512" (UID: "eb913442-9b33-4a88-a111-164879c37512"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.564324 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb913442-9b33-4a88-a111-164879c37512-kube-api-access-lvdpx" (OuterVolumeSpecName: "kube-api-access-lvdpx") pod "eb913442-9b33-4a88-a111-164879c37512" (UID: "eb913442-9b33-4a88-a111-164879c37512"). InnerVolumeSpecName "kube-api-access-lvdpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.601633 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb913442-9b33-4a88-a111-164879c37512" (UID: "eb913442-9b33-4a88-a111-164879c37512"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.656732 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.656762 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvdpx\" (UniqueName: \"kubernetes.io/projected/eb913442-9b33-4a88-a111-164879c37512-kube-api-access-lvdpx\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:23 crc kubenswrapper[4837]: I1014 13:04:23.656776 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb913442-9b33-4a88-a111-164879c37512-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:24 crc kubenswrapper[4837]: I1014 13:04:24.502770 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lq7f" Oct 14 13:04:24 crc kubenswrapper[4837]: I1014 13:04:24.532428 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lq7f"] Oct 14 13:04:24 crc kubenswrapper[4837]: I1014 13:04:24.535113 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2lq7f"] Oct 14 13:04:24 crc kubenswrapper[4837]: I1014 13:04:24.798791 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb913442-9b33-4a88-a111-164879c37512" path="/var/lib/kubelet/pods/eb913442-9b33-4a88-a111-164879c37512/volumes" Oct 14 13:04:29 crc kubenswrapper[4837]: I1014 13:04:29.803732 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:04:29 crc kubenswrapper[4837]: I1014 13:04:29.846029 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.140132 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.140810 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.140891 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.141828 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.141938 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052" gracePeriod=600 Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.601843 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052" exitCode=0 Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.601894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052"} Oct 14 13:04:41 crc kubenswrapper[4837]: I1014 13:04:41.602235 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"b54de4ef780d289e7d2d626fe6e6ffa01ff36d90064fef6b767f17c485b0e770"} Oct 14 13:04:49 crc kubenswrapper[4837]: I1014 13:04:49.994670 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n8266"] Oct 
14 13:04:49 crc kubenswrapper[4837]: I1014 13:04:49.995596 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n8266" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="registry-server" containerID="cri-o://15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16" gracePeriod=30 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.005612 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p5s4"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.006232 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8p5s4" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerName="registry-server" containerID="cri-o://39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972" gracePeriod=30 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.024344 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm47m"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.024645 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" podUID="0e168cef-fe99-471f-89db-34290cbb6639" containerName="marketplace-operator" containerID="cri-o://5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705" gracePeriod=30 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.029485 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krz7p"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.029761 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krz7p" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="registry-server" 
containerID="cri-o://b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc" gracePeriod=30 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.038324 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f27qg"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.038544 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f27qg" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="registry-server" containerID="cri-o://33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d" gracePeriod=30 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.044928 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8tjh"] Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045179 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fceb6b-af85-4aeb-9321-389fe647b886" containerName="pruner" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045193 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fceb6b-af85-4aeb-9321-389fe647b886" containerName="pruner" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045204 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045213 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045228 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045236 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="registry-server" Oct 
14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045255 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045264 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045274 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045283 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045297 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045305 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045319 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045328 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045341 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045348 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="extract-utilities" Oct 14 
13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045361 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045369 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045380 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045388 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045401 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045410 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045421 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd83f994-c75d-4e67-a785-b248a51e8986" containerName="pruner" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045430 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd83f994-c75d-4e67-a785-b248a51e8986" containerName="pruner" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.045962 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="extract-content" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045976 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="extract-content" Oct 14 13:04:50 crc 
kubenswrapper[4837]: E1014 13:04:50.045989 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.045998 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="extract-utilities" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046119 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fceb6b-af85-4aeb-9321-389fe647b886" containerName="pruner" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046133 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="351b9e6b-6146-435e-89db-484d35087b98" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046149 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd83f994-c75d-4e67-a785-b248a51e8986" containerName="pruner" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046179 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="73944cee-f5d1-491f-bfb5-5663eae0c27b" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046194 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e629bc-e6dd-4cb8-9a73-b3c189d560e4" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046204 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb913442-9b33-4a88-a111-164879c37512" containerName="registry-server" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.046638 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.053294 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8tjh"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.065368 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-8p5s4" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerName="registry-server" probeResult="failure" output="" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.132059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cde48aeb-8f47-4fde-a2cb-a95c09051e43-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.132175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcznx\" (UniqueName: \"kubernetes.io/projected/cde48aeb-8f47-4fde-a2cb-a95c09051e43-kube-api-access-pcznx\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.132234 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cde48aeb-8f47-4fde-a2cb-a95c09051e43-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.233228 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cde48aeb-8f47-4fde-a2cb-a95c09051e43-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.233285 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cde48aeb-8f47-4fde-a2cb-a95c09051e43-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.233347 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcznx\" (UniqueName: \"kubernetes.io/projected/cde48aeb-8f47-4fde-a2cb-a95c09051e43-kube-api-access-pcznx\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.235937 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cde48aeb-8f47-4fde-a2cb-a95c09051e43-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.239726 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cde48aeb-8f47-4fde-a2cb-a95c09051e43-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: 
\"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.253467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcznx\" (UniqueName: \"kubernetes.io/projected/cde48aeb-8f47-4fde-a2cb-a95c09051e43-kube-api-access-pcznx\") pod \"marketplace-operator-79b997595-t8tjh\" (UID: \"cde48aeb-8f47-4fde-a2cb-a95c09051e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.370464 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.511131 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.518817 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.522407 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.528700 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.556822 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.562280 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8tjh"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.672855 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-catalog-content\") pod \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.672909 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2hjx\" (UniqueName: \"kubernetes.io/projected/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-kube-api-access-w2hjx\") pod \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.672935 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbzk7\" (UniqueName: \"kubernetes.io/projected/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-kube-api-access-dbzk7\") pod \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.672952 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-utilities\") pod \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673000 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-trusted-ca\") pod 
\"0e168cef-fe99-471f-89db-34290cbb6639\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673015 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-catalog-content\") pod \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\" (UID: \"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673032 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsmg8\" (UniqueName: \"kubernetes.io/projected/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-kube-api-access-vsmg8\") pod \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673062 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-operator-metrics\") pod \"0e168cef-fe99-471f-89db-34290cbb6639\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673082 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-catalog-content\") pod \"49e47458-6044-4966-a0e5-3a8e243414f8\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673099 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft4rf\" (UniqueName: \"kubernetes.io/projected/0e168cef-fe99-471f-89db-34290cbb6639-kube-api-access-ft4rf\") pod \"0e168cef-fe99-471f-89db-34290cbb6639\" (UID: \"0e168cef-fe99-471f-89db-34290cbb6639\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673114 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dqtm\" (UniqueName: \"kubernetes.io/projected/49e47458-6044-4966-a0e5-3a8e243414f8-kube-api-access-8dqtm\") pod \"49e47458-6044-4966-a0e5-3a8e243414f8\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673138 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-utilities\") pod \"49e47458-6044-4966-a0e5-3a8e243414f8\" (UID: \"49e47458-6044-4966-a0e5-3a8e243414f8\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673174 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-catalog-content\") pod \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673215 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-utilities\") pod \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\" (UID: \"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.673242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-utilities\") pod \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\" (UID: \"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35\") " Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.674311 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-utilities" (OuterVolumeSpecName: "utilities") pod "020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" (UID: 
"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.675098 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-utilities" (OuterVolumeSpecName: "utilities") pod "49e47458-6044-4966-a0e5-3a8e243414f8" (UID: "49e47458-6044-4966-a0e5-3a8e243414f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.676871 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-utilities" (OuterVolumeSpecName: "utilities") pod "fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" (UID: "fbc7d01d-bef7-470d-a2ea-15f0a8f954d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.676867 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-utilities" (OuterVolumeSpecName: "utilities") pod "e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" (UID: "e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.677126 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0e168cef-fe99-471f-89db-34290cbb6639" (UID: "0e168cef-fe99-471f-89db-34290cbb6639"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.694347 4837 generic.go:334] "Generic (PLEG): container finished" podID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerID="33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d" exitCode=0 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.694610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerDied","Data":"33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.694639 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f27qg" event={"ID":"020802bf-a0e7-44c0-b0d0-0d1f7d66eb35","Type":"ContainerDied","Data":"3b61c56401fcd63334adfd9fa78dd6acf25b953ea4a3524b7df9562b9b7e1d7c"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.694654 4837 scope.go:117] "RemoveContainer" containerID="33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.694765 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f27qg" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.697896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" event={"ID":"cde48aeb-8f47-4fde-a2cb-a95c09051e43","Type":"ContainerStarted","Data":"ed76d4e81c1d305b59a9b0b59275f573166d4458ef91109570bb008b2a2b8776"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.707827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-kube-api-access-dbzk7" (OuterVolumeSpecName: "kube-api-access-dbzk7") pod "fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" (UID: "fbc7d01d-bef7-470d-a2ea-15f0a8f954d9"). InnerVolumeSpecName "kube-api-access-dbzk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.707921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0e168cef-fe99-471f-89db-34290cbb6639" (UID: "0e168cef-fe99-471f-89db-34290cbb6639"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.707973 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e168cef-fe99-471f-89db-34290cbb6639-kube-api-access-ft4rf" (OuterVolumeSpecName: "kube-api-access-ft4rf") pod "0e168cef-fe99-471f-89db-34290cbb6639" (UID: "0e168cef-fe99-471f-89db-34290cbb6639"). InnerVolumeSpecName "kube-api-access-ft4rf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.708025 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-kube-api-access-vsmg8" (OuterVolumeSpecName: "kube-api-access-vsmg8") pod "020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" (UID: "020802bf-a0e7-44c0-b0d0-0d1f7d66eb35"). InnerVolumeSpecName "kube-api-access-vsmg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.708086 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e47458-6044-4966-a0e5-3a8e243414f8-kube-api-access-8dqtm" (OuterVolumeSpecName: "kube-api-access-8dqtm") pod "49e47458-6044-4966-a0e5-3a8e243414f8" (UID: "49e47458-6044-4966-a0e5-3a8e243414f8"). InnerVolumeSpecName "kube-api-access-8dqtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.708442 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-kube-api-access-w2hjx" (OuterVolumeSpecName: "kube-api-access-w2hjx") pod "e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" (UID: "e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81"). InnerVolumeSpecName "kube-api-access-w2hjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.715447 4837 generic.go:334] "Generic (PLEG): container finished" podID="49e47458-6044-4966-a0e5-3a8e243414f8" containerID="15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16" exitCode=0 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.715554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerDied","Data":"15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.715592 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n8266" event={"ID":"49e47458-6044-4966-a0e5-3a8e243414f8","Type":"ContainerDied","Data":"710e6da397c983c3c704f236eabe147caf23fb20e1fe8d8122319f76571e15f9"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.715680 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n8266" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.730805 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerID="b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc" exitCode=0 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.731017 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krz7p" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.742666 4837 scope.go:117] "RemoveContainer" containerID="eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.728753 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8v59z"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.743795 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krz7p" event={"ID":"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81","Type":"ContainerDied","Data":"b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.743831 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krz7p" event={"ID":"e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81","Type":"ContainerDied","Data":"dd0934477a5b0ad67d23adcab9efc4069d4b6f9d4766c4d4280b1e0cc28a2fa7"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.758415 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49e47458-6044-4966-a0e5-3a8e243414f8" (UID: "49e47458-6044-4966-a0e5-3a8e243414f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.763072 4837 generic.go:334] "Generic (PLEG): container finished" podID="0e168cef-fe99-471f-89db-34290cbb6639" containerID="5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705" exitCode=0 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.763179 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" event={"ID":"0e168cef-fe99-471f-89db-34290cbb6639","Type":"ContainerDied","Data":"5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.763205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" event={"ID":"0e168cef-fe99-471f-89db-34290cbb6639","Type":"ContainerDied","Data":"cce46d4401e5535d1caf93c71d62c0004258f57157c8d90a473890d12e65f4fa"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.763261 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sm47m" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.771171 4837 generic.go:334] "Generic (PLEG): container finished" podID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerID="39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972" exitCode=0 Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.771210 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p5s4" event={"ID":"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9","Type":"ContainerDied","Data":"39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.771234 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p5s4" event={"ID":"fbc7d01d-bef7-470d-a2ea-15f0a8f954d9","Type":"ContainerDied","Data":"16f5d2bac1ad0b2509a9ed9b4bbdcd66181a3998016196bdba8e30b737f83edf"} Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.771311 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p5s4" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.773181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" (UID: "fbc7d01d-bef7-470d-a2ea-15f0a8f954d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775655 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbzk7\" (UniqueName: \"kubernetes.io/projected/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-kube-api-access-dbzk7\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775692 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775703 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775721 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsmg8\" (UniqueName: \"kubernetes.io/projected/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-kube-api-access-vsmg8\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775742 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e168cef-fe99-471f-89db-34290cbb6639-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775751 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775759 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft4rf\" (UniqueName: \"kubernetes.io/projected/0e168cef-fe99-471f-89db-34290cbb6639-kube-api-access-ft4rf\") on node \"crc\" 
DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775767 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dqtm\" (UniqueName: \"kubernetes.io/projected/49e47458-6044-4966-a0e5-3a8e243414f8-kube-api-access-8dqtm\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775776 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e47458-6044-4966-a0e5-3a8e243414f8-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775785 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775794 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775802 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.775811 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2hjx\" (UniqueName: \"kubernetes.io/projected/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-kube-api-access-w2hjx\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.798317 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" (UID: "e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.819580 4837 scope.go:117] "RemoveContainer" containerID="c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.845916 4837 scope.go:117] "RemoveContainer" containerID="33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.846206 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d\": container with ID starting with 33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d not found: ID does not exist" containerID="33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.846260 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d"} err="failed to get container status \"33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d\": rpc error: code = NotFound desc = could not find container \"33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d\": container with ID starting with 33ee9c0b56ea96f58f0894a59486dfa1bf3c1c79b733db75444d8ea5e0d2731d not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.846283 4837 scope.go:117] "RemoveContainer" containerID="eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.846710 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f\": container with ID starting with 
eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f not found: ID does not exist" containerID="eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.846768 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f"} err="failed to get container status \"eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f\": rpc error: code = NotFound desc = could not find container \"eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f\": container with ID starting with eae02fbe676b8fa526e44bb701ef45ab1fe034881d9940d7c3208f275007a13f not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.846809 4837 scope.go:117] "RemoveContainer" containerID="c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.847110 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e\": container with ID starting with c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e not found: ID does not exist" containerID="c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.847144 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e"} err="failed to get container status \"c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e\": rpc error: code = NotFound desc = could not find container \"c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e\": container with ID starting with c6bc5568b68e739a1b3fc4430d9f134eeaf24ddbea514ed6d8128cd486c3157e not found: ID does not 
exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.847165 4837 scope.go:117] "RemoveContainer" containerID="15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.862553 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" (UID: "020802bf-a0e7-44c0-b0d0-0d1f7d66eb35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.863128 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm47m"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.877414 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.877457 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.878598 4837 scope.go:117] "RemoveContainer" containerID="b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.881178 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sm47m"] Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.907355 4837 scope.go:117] "RemoveContainer" containerID="9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.923235 4837 scope.go:117] "RemoveContainer" 
containerID="15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.923560 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16\": container with ID starting with 15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16 not found: ID does not exist" containerID="15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.923603 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16"} err="failed to get container status \"15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16\": rpc error: code = NotFound desc = could not find container \"15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16\": container with ID starting with 15bf4783c52a3c84e96404869965e1efe3f4d4117b3fd2d116fa0286c8b61b16 not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.923630 4837 scope.go:117] "RemoveContainer" containerID="b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.923898 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f\": container with ID starting with b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f not found: ID does not exist" containerID="b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.923925 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f"} err="failed to get container status \"b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f\": rpc error: code = NotFound desc = could not find container \"b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f\": container with ID starting with b32b31dcb49ceb7fbec0b603b2fca8f4643207dbd8e9e2bacb0c62315f46136f not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.923940 4837 scope.go:117] "RemoveContainer" containerID="9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.924154 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066\": container with ID starting with 9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066 not found: ID does not exist" containerID="9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.924209 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066"} err="failed to get container status \"9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066\": rpc error: code = NotFound desc = could not find container \"9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066\": container with ID starting with 9cf0e9c70923a84985acdb34cf30c6cb223d6cda54137646681abf2e55b06066 not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.924223 4837 scope.go:117] "RemoveContainer" containerID="b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.934188 4837 scope.go:117] "RemoveContainer" 
containerID="dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.949847 4837 scope.go:117] "RemoveContainer" containerID="e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.966744 4837 scope.go:117] "RemoveContainer" containerID="b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.967197 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc\": container with ID starting with b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc not found: ID does not exist" containerID="b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.967234 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc"} err="failed to get container status \"b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc\": rpc error: code = NotFound desc = could not find container \"b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc\": container with ID starting with b4b77594eaa0d32b8b5fd80c11bb733a23b440e1c9955cb8778cb9d8e0d23edc not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.967262 4837 scope.go:117] "RemoveContainer" containerID="dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.967644 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9\": container with ID starting with 
dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9 not found: ID does not exist" containerID="dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.967664 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9"} err="failed to get container status \"dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9\": rpc error: code = NotFound desc = could not find container \"dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9\": container with ID starting with dc81dcb4aa45f88a9464f50f3a744ec9b832ac9da3f0e06db10fed8f9a90bfd9 not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.967677 4837 scope.go:117] "RemoveContainer" containerID="e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.968009 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5\": container with ID starting with e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5 not found: ID does not exist" containerID="e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.968054 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5"} err="failed to get container status \"e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5\": rpc error: code = NotFound desc = could not find container \"e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5\": container with ID starting with e9cc33c0d54e30777dec067da981b47e0abb11047e825fc852078776a98356d5 not found: ID does not 
exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.968082 4837 scope.go:117] "RemoveContainer" containerID="5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.979406 4837 scope.go:117] "RemoveContainer" containerID="5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705" Oct 14 13:04:50 crc kubenswrapper[4837]: E1014 13:04:50.979726 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705\": container with ID starting with 5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705 not found: ID does not exist" containerID="5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.979761 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705"} err="failed to get container status \"5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705\": rpc error: code = NotFound desc = could not find container \"5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705\": container with ID starting with 5f7812e5b3079b10ea4a27bc26180ddf8ae34224b74fbbcf0729de28e5ba4705 not found: ID does not exist" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.979786 4837 scope.go:117] "RemoveContainer" containerID="39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972" Oct 14 13:04:50 crc kubenswrapper[4837]: I1014 13:04:50.994057 4837 scope.go:117] "RemoveContainer" containerID="e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.010388 4837 scope.go:117] "RemoveContainer" containerID="a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db" Oct 14 13:04:51 crc 
kubenswrapper[4837]: I1014 13:04:51.017287 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f27qg"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.022500 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f27qg"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.029309 4837 scope.go:117] "RemoveContainer" containerID="39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972" Oct 14 13:04:51 crc kubenswrapper[4837]: E1014 13:04:51.029707 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972\": container with ID starting with 39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972 not found: ID does not exist" containerID="39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.029740 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972"} err="failed to get container status \"39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972\": rpc error: code = NotFound desc = could not find container \"39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972\": container with ID starting with 39608f5fb4f41f105e8906d7d1c6e34c9fcb3ab9ad4f8dd3ad4182ffd2d11972 not found: ID does not exist" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.029765 4837 scope.go:117] "RemoveContainer" containerID="e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b" Oct 14 13:04:51 crc kubenswrapper[4837]: E1014 13:04:51.030314 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b\": 
container with ID starting with e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b not found: ID does not exist" containerID="e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.030344 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b"} err="failed to get container status \"e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b\": rpc error: code = NotFound desc = could not find container \"e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b\": container with ID starting with e632f5d4f1ea8072443ade9362cf334db9990b30f4a92ed28a300613c519db6b not found: ID does not exist" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.030369 4837 scope.go:117] "RemoveContainer" containerID="a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db" Oct 14 13:04:51 crc kubenswrapper[4837]: E1014 13:04:51.030864 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db\": container with ID starting with a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db not found: ID does not exist" containerID="a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.030912 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db"} err="failed to get container status \"a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db\": rpc error: code = NotFound desc = could not find container \"a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db\": container with ID starting with 
a2e42d27987a44b910529f845d0576ede14100f893754918e981299560c069db not found: ID does not exist" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.037200 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n8266"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.039800 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n8266"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.044475 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krz7p"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.049741 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krz7p"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.091263 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p5s4"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.095295 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8p5s4"] Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.784828 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" event={"ID":"cde48aeb-8f47-4fde-a2cb-a95c09051e43","Type":"ContainerStarted","Data":"efd601e9fba9eab97da4b1570b8b7dabb6d637cdab24e3df5599fafdfc16953c"} Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.785392 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.788057 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" Oct 14 13:04:51 crc kubenswrapper[4837]: I1014 13:04:51.819620 4837 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/marketplace-operator-79b997595-t8tjh" podStartSLOduration=1.819605502 podStartE2EDuration="1.819605502s" podCreationTimestamp="2025-10-14 13:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:04:51.799869828 +0000 UTC m=+229.716869641" watchObservedRunningTime="2025-10-14 13:04:51.819605502 +0000 UTC m=+229.736605315" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.004764 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rbmpr"] Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.004998 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005011 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005028 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005036 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005048 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005056 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005068 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" 
containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005076 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005089 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e168cef-fe99-471f-89db-34290cbb6639" containerName="marketplace-operator" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005097 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e168cef-fe99-471f-89db-34290cbb6639" containerName="marketplace-operator" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005112 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005120 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005130 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005139 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005151 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005165 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="extract-content" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005180 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" 
containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005222 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005234 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005244 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005260 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005270 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005288 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005299 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="extract-utilities" Oct 14 13:04:52 crc kubenswrapper[4837]: E1014 13:04:52.005311 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005321 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005440 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" 
containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005452 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e168cef-fe99-471f-89db-34290cbb6639" containerName="marketplace-operator" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005466 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005477 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.005488 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" containerName="registry-server" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.006484 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.008325 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.023883 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbmpr"] Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.090620 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f027db-f79f-4187-82df-c6d63c37ffce-utilities\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.090665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/28f027db-f79f-4187-82df-c6d63c37ffce-catalog-content\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.090857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmcmt\" (UniqueName: \"kubernetes.io/projected/28f027db-f79f-4187-82df-c6d63c37ffce-kube-api-access-jmcmt\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.191575 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmcmt\" (UniqueName: \"kubernetes.io/projected/28f027db-f79f-4187-82df-c6d63c37ffce-kube-api-access-jmcmt\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.191649 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f027db-f79f-4187-82df-c6d63c37ffce-utilities\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.191668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f027db-f79f-4187-82df-c6d63c37ffce-catalog-content\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.192098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f027db-f79f-4187-82df-c6d63c37ffce-catalog-content\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.192311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f027db-f79f-4187-82df-c6d63c37ffce-utilities\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.219397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmcmt\" (UniqueName: \"kubernetes.io/projected/28f027db-f79f-4187-82df-c6d63c37ffce-kube-api-access-jmcmt\") pod \"certified-operators-rbmpr\" (UID: \"28f027db-f79f-4187-82df-c6d63c37ffce\") " pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.330231 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.518004 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbmpr"] Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.608036 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b2ll4"] Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.611233 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.614319 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.622226 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2ll4"] Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.697044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61e9a58b-2860-421e-b616-4ca4234a1e24-utilities\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.697232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7q8\" (UniqueName: \"kubernetes.io/projected/61e9a58b-2860-421e-b616-4ca4234a1e24-kube-api-access-pq7q8\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.697272 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61e9a58b-2860-421e-b616-4ca4234a1e24-catalog-content\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.794110 4837 generic.go:334] "Generic (PLEG): container finished" podID="28f027db-f79f-4187-82df-c6d63c37ffce" containerID="d661bdd0174f57d2d58a4cf8e50a55a1c03cc8eba74ef147cb6dc94cc344b420" exitCode=0 Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 
13:04:52.794456 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020802bf-a0e7-44c0-b0d0-0d1f7d66eb35" path="/var/lib/kubelet/pods/020802bf-a0e7-44c0-b0d0-0d1f7d66eb35/volumes" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.796214 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e168cef-fe99-471f-89db-34290cbb6639" path="/var/lib/kubelet/pods/0e168cef-fe99-471f-89db-34290cbb6639/volumes" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.796833 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e47458-6044-4966-a0e5-3a8e243414f8" path="/var/lib/kubelet/pods/49e47458-6044-4966-a0e5-3a8e243414f8/volumes" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.798129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7q8\" (UniqueName: \"kubernetes.io/projected/61e9a58b-2860-421e-b616-4ca4234a1e24-kube-api-access-pq7q8\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.798214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61e9a58b-2860-421e-b616-4ca4234a1e24-catalog-content\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.798249 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61e9a58b-2860-421e-b616-4ca4234a1e24-utilities\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.800067 4837 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81" path="/var/lib/kubelet/pods/e1e1174c-efa8-4ccb-8a8f-2ed2c1b32e81/volumes" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.801196 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc7d01d-bef7-470d-a2ea-15f0a8f954d9" path="/var/lib/kubelet/pods/fbc7d01d-bef7-470d-a2ea-15f0a8f954d9/volumes" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.801719 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61e9a58b-2860-421e-b616-4ca4234a1e24-utilities\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.802053 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61e9a58b-2860-421e-b616-4ca4234a1e24-catalog-content\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.802399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbmpr" event={"ID":"28f027db-f79f-4187-82df-c6d63c37ffce","Type":"ContainerDied","Data":"d661bdd0174f57d2d58a4cf8e50a55a1c03cc8eba74ef147cb6dc94cc344b420"} Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.802451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbmpr" event={"ID":"28f027db-f79f-4187-82df-c6d63c37ffce","Type":"ContainerStarted","Data":"4d6ea8e8f675308dd0e574d763093e7389f529d861378740f8e9888c32ec08cb"} Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.823483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7q8\" (UniqueName: 
\"kubernetes.io/projected/61e9a58b-2860-421e-b616-4ca4234a1e24-kube-api-access-pq7q8\") pod \"redhat-marketplace-b2ll4\" (UID: \"61e9a58b-2860-421e-b616-4ca4234a1e24\") " pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:52 crc kubenswrapper[4837]: I1014 13:04:52.982406 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:04:53 crc kubenswrapper[4837]: I1014 13:04:53.176847 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2ll4"] Oct 14 13:04:53 crc kubenswrapper[4837]: W1014 13:04:53.183136 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e9a58b_2860_421e_b616_4ca4234a1e24.slice/crio-38827aed16560af5102eeca041f8f0857513c972f8764eb17dbb6f60f639b359 WatchSource:0}: Error finding container 38827aed16560af5102eeca041f8f0857513c972f8764eb17dbb6f60f639b359: Status 404 returned error can't find the container with id 38827aed16560af5102eeca041f8f0857513c972f8764eb17dbb6f60f639b359 Oct 14 13:04:53 crc kubenswrapper[4837]: I1014 13:04:53.802065 4837 generic.go:334] "Generic (PLEG): container finished" podID="28f027db-f79f-4187-82df-c6d63c37ffce" containerID="2f6a331cdba5e69f564fe1e6a0f885e136328c3c5df6d7a38d8eae421724be35" exitCode=0 Oct 14 13:04:53 crc kubenswrapper[4837]: I1014 13:04:53.802131 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbmpr" event={"ID":"28f027db-f79f-4187-82df-c6d63c37ffce","Type":"ContainerDied","Data":"2f6a331cdba5e69f564fe1e6a0f885e136328c3c5df6d7a38d8eae421724be35"} Oct 14 13:04:53 crc kubenswrapper[4837]: I1014 13:04:53.804786 4837 generic.go:334] "Generic (PLEG): container finished" podID="61e9a58b-2860-421e-b616-4ca4234a1e24" containerID="35870745674e00f65c53f842e68b029c1ed054447287ff0f44a723fa17e106ed" exitCode=0 Oct 14 13:04:53 crc 
kubenswrapper[4837]: I1014 13:04:53.804839 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2ll4" event={"ID":"61e9a58b-2860-421e-b616-4ca4234a1e24","Type":"ContainerDied","Data":"35870745674e00f65c53f842e68b029c1ed054447287ff0f44a723fa17e106ed"} Oct 14 13:04:53 crc kubenswrapper[4837]: I1014 13:04:53.804897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2ll4" event={"ID":"61e9a58b-2860-421e-b616-4ca4234a1e24","Type":"ContainerStarted","Data":"38827aed16560af5102eeca041f8f0857513c972f8764eb17dbb6f60f639b359"} Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.407748 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9bvw9"] Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.409631 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.411834 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.417500 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bvw9"] Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.521491 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd69629-bfc4-405d-8764-a2082b5c8449-catalog-content\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.521605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx775\" (UniqueName: 
\"kubernetes.io/projected/3bd69629-bfc4-405d-8764-a2082b5c8449-kube-api-access-nx775\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.521670 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd69629-bfc4-405d-8764-a2082b5c8449-utilities\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.623823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx775\" (UniqueName: \"kubernetes.io/projected/3bd69629-bfc4-405d-8764-a2082b5c8449-kube-api-access-nx775\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.623894 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd69629-bfc4-405d-8764-a2082b5c8449-utilities\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.623961 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd69629-bfc4-405d-8764-a2082b5c8449-catalog-content\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.624615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3bd69629-bfc4-405d-8764-a2082b5c8449-utilities\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.624608 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd69629-bfc4-405d-8764-a2082b5c8449-catalog-content\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.661631 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx775\" (UniqueName: \"kubernetes.io/projected/3bd69629-bfc4-405d-8764-a2082b5c8449-kube-api-access-nx775\") pod \"redhat-operators-9bvw9\" (UID: \"3bd69629-bfc4-405d-8764-a2082b5c8449\") " pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.752588 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.819529 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbmpr" event={"ID":"28f027db-f79f-4187-82df-c6d63c37ffce","Type":"ContainerStarted","Data":"b5c2d84cee510a861b88c66420da1ea33b1bc82cb36d9fa9a8d04291cae3e287"} Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.822833 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2ll4" event={"ID":"61e9a58b-2860-421e-b616-4ca4234a1e24","Type":"ContainerDied","Data":"e8cad7c9c6b41d5f8cf6952d2e2420a6f98a46a8201772cb35378ebe87755395"} Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.822305 4837 generic.go:334] "Generic (PLEG): container finished" podID="61e9a58b-2860-421e-b616-4ca4234a1e24" containerID="e8cad7c9c6b41d5f8cf6952d2e2420a6f98a46a8201772cb35378ebe87755395" exitCode=0 Oct 14 13:04:54 crc kubenswrapper[4837]: I1014 13:04:54.842137 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rbmpr" podStartSLOduration=2.362184682 podStartE2EDuration="3.842110375s" podCreationTimestamp="2025-10-14 13:04:51 +0000 UTC" firstStartedPulling="2025-10-14 13:04:52.797009411 +0000 UTC m=+230.714009234" lastFinishedPulling="2025-10-14 13:04:54.276935074 +0000 UTC m=+232.193934927" observedRunningTime="2025-10-14 13:04:54.839576526 +0000 UTC m=+232.756576339" watchObservedRunningTime="2025-10-14 13:04:54.842110375 +0000 UTC m=+232.759110198" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.002873 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hf72l"] Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.005041 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.010882 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.013562 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hf72l"] Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.130983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25h2v\" (UniqueName: \"kubernetes.io/projected/5484dc7a-db10-484c-94e5-faae9179b8bc-kube-api-access-25h2v\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.131055 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5484dc7a-db10-484c-94e5-faae9179b8bc-catalog-content\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.131142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5484dc7a-db10-484c-94e5-faae9179b8bc-utilities\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.190774 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bvw9"] Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.232599 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5484dc7a-db10-484c-94e5-faae9179b8bc-utilities\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.233694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25h2v\" (UniqueName: \"kubernetes.io/projected/5484dc7a-db10-484c-94e5-faae9179b8bc-kube-api-access-25h2v\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.233734 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5484dc7a-db10-484c-94e5-faae9179b8bc-catalog-content\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.234069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5484dc7a-db10-484c-94e5-faae9179b8bc-catalog-content\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.233585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5484dc7a-db10-484c-94e5-faae9179b8bc-utilities\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.251734 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25h2v\" (UniqueName: 
\"kubernetes.io/projected/5484dc7a-db10-484c-94e5-faae9179b8bc-kube-api-access-25h2v\") pod \"community-operators-hf72l\" (UID: \"5484dc7a-db10-484c-94e5-faae9179b8bc\") " pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.337558 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.538949 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hf72l"] Oct 14 13:04:55 crc kubenswrapper[4837]: W1014 13:04:55.542483 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5484dc7a_db10_484c_94e5_faae9179b8bc.slice/crio-74b9852ae8290f9f0d7be093986f797b213cae175a21a446d608ad5eb8e27df0 WatchSource:0}: Error finding container 74b9852ae8290f9f0d7be093986f797b213cae175a21a446d608ad5eb8e27df0: Status 404 returned error can't find the container with id 74b9852ae8290f9f0d7be093986f797b213cae175a21a446d608ad5eb8e27df0 Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.829770 4837 generic.go:334] "Generic (PLEG): container finished" podID="5484dc7a-db10-484c-94e5-faae9179b8bc" containerID="f8a357445ee4667f10d8a7791bfcac1f9d58ff00f5dd546665de06c3480f94ea" exitCode=0 Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.829893 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hf72l" event={"ID":"5484dc7a-db10-484c-94e5-faae9179b8bc","Type":"ContainerDied","Data":"f8a357445ee4667f10d8a7791bfcac1f9d58ff00f5dd546665de06c3480f94ea"} Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.830160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hf72l" 
event={"ID":"5484dc7a-db10-484c-94e5-faae9179b8bc","Type":"ContainerStarted","Data":"74b9852ae8290f9f0d7be093986f797b213cae175a21a446d608ad5eb8e27df0"} Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.831751 4837 generic.go:334] "Generic (PLEG): container finished" podID="3bd69629-bfc4-405d-8764-a2082b5c8449" containerID="e4121221246b8092c9c3a7a3599b2424843d8b12c9f33b3e3a8850fb18d17f45" exitCode=0 Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.831821 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bvw9" event={"ID":"3bd69629-bfc4-405d-8764-a2082b5c8449","Type":"ContainerDied","Data":"e4121221246b8092c9c3a7a3599b2424843d8b12c9f33b3e3a8850fb18d17f45"} Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.831860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bvw9" event={"ID":"3bd69629-bfc4-405d-8764-a2082b5c8449","Type":"ContainerStarted","Data":"59c2852c384f21c8131a09995b2b30cd04443394ce236f094189ef3497425110"} Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.834595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2ll4" event={"ID":"61e9a58b-2860-421e-b616-4ca4234a1e24","Type":"ContainerStarted","Data":"c5f08b8fde7df942c84f38ff1bab7ebeb251951c275bea5ae7f32337059e7736"} Oct 14 13:04:55 crc kubenswrapper[4837]: I1014 13:04:55.894015 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b2ll4" podStartSLOduration=2.165433764 podStartE2EDuration="3.893995422s" podCreationTimestamp="2025-10-14 13:04:52 +0000 UTC" firstStartedPulling="2025-10-14 13:04:53.806268244 +0000 UTC m=+231.723268087" lastFinishedPulling="2025-10-14 13:04:55.534829932 +0000 UTC m=+233.451829745" observedRunningTime="2025-10-14 13:04:55.892319166 +0000 UTC m=+233.809318989" watchObservedRunningTime="2025-10-14 13:04:55.893995422 +0000 UTC m=+233.810995235" 
Oct 14 13:04:57 crc kubenswrapper[4837]: I1014 13:04:57.844886 4837 generic.go:334] "Generic (PLEG): container finished" podID="5484dc7a-db10-484c-94e5-faae9179b8bc" containerID="b929a7615018176567e7bc45cd1f09fac13533b0bc7dbb250eeeef9efcc61051" exitCode=0 Oct 14 13:04:57 crc kubenswrapper[4837]: I1014 13:04:57.845108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hf72l" event={"ID":"5484dc7a-db10-484c-94e5-faae9179b8bc","Type":"ContainerDied","Data":"b929a7615018176567e7bc45cd1f09fac13533b0bc7dbb250eeeef9efcc61051"} Oct 14 13:04:57 crc kubenswrapper[4837]: I1014 13:04:57.847942 4837 generic.go:334] "Generic (PLEG): container finished" podID="3bd69629-bfc4-405d-8764-a2082b5c8449" containerID="3686f8c812e92778ad936fb2bc6ac7830133923e7cdd2684c9b10e7c87ed3cf5" exitCode=0 Oct 14 13:04:57 crc kubenswrapper[4837]: I1014 13:04:57.847988 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bvw9" event={"ID":"3bd69629-bfc4-405d-8764-a2082b5c8449","Type":"ContainerDied","Data":"3686f8c812e92778ad936fb2bc6ac7830133923e7cdd2684c9b10e7c87ed3cf5"} Oct 14 13:04:59 crc kubenswrapper[4837]: I1014 13:04:59.861282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bvw9" event={"ID":"3bd69629-bfc4-405d-8764-a2082b5c8449","Type":"ContainerStarted","Data":"78b2480d92bb9e4fbc9c40128216cb855269905a7da1c0c3fb4d92117304d66a"} Oct 14 13:04:59 crc kubenswrapper[4837]: I1014 13:04:59.864881 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hf72l" event={"ID":"5484dc7a-db10-484c-94e5-faae9179b8bc","Type":"ContainerStarted","Data":"2a721b9e2972c0448292438f148c8e95765e210e02ad575cbf531975d83173f2"} Oct 14 13:04:59 crc kubenswrapper[4837]: I1014 13:04:59.883114 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9bvw9" 
podStartSLOduration=3.114894056 podStartE2EDuration="5.88309592s" podCreationTimestamp="2025-10-14 13:04:54 +0000 UTC" firstStartedPulling="2025-10-14 13:04:55.834086248 +0000 UTC m=+233.751086071" lastFinishedPulling="2025-10-14 13:04:58.602288122 +0000 UTC m=+236.519287935" observedRunningTime="2025-10-14 13:04:59.879865232 +0000 UTC m=+237.796865095" watchObservedRunningTime="2025-10-14 13:04:59.88309592 +0000 UTC m=+237.800095723" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.330839 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.330934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.388264 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.407621 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hf72l" podStartSLOduration=5.867223869 podStartE2EDuration="8.40759982s" podCreationTimestamp="2025-10-14 13:04:54 +0000 UTC" firstStartedPulling="2025-10-14 13:04:55.832046703 +0000 UTC m=+233.749046516" lastFinishedPulling="2025-10-14 13:04:58.372422624 +0000 UTC m=+236.289422467" observedRunningTime="2025-10-14 13:04:59.905919128 +0000 UTC m=+237.822918941" watchObservedRunningTime="2025-10-14 13:05:02.40759982 +0000 UTC m=+240.324599673" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.922281 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rbmpr" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.994980 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:05:02 crc kubenswrapper[4837]: I1014 13:05:02.995484 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:05:03 crc kubenswrapper[4837]: I1014 13:05:03.041707 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:05:03 crc kubenswrapper[4837]: I1014 13:05:03.933601 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b2ll4" Oct 14 13:05:04 crc kubenswrapper[4837]: I1014 13:05:04.752943 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:05:04 crc kubenswrapper[4837]: I1014 13:05:04.753029 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:05:04 crc kubenswrapper[4837]: I1014 13:05:04.798467 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:05:04 crc kubenswrapper[4837]: I1014 13:05:04.950948 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9bvw9" Oct 14 13:05:05 crc kubenswrapper[4837]: I1014 13:05:05.338386 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:05:05 crc kubenswrapper[4837]: I1014 13:05:05.338484 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:05:05 crc kubenswrapper[4837]: I1014 13:05:05.395460 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:05:05 crc kubenswrapper[4837]: I1014 
13:05:05.933366 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hf72l" Oct 14 13:05:15 crc kubenswrapper[4837]: I1014 13:05:15.777226 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" podUID="64edb413-91a3-48ab-8d24-131c2d4fecb7" containerName="oauth-openshift" containerID="cri-o://3c4a325735ed6f450667b69e99b26cdaf05c057ef1dedb91580f7f48ce7f4615" gracePeriod=15 Oct 14 13:05:15 crc kubenswrapper[4837]: I1014 13:05:15.946072 4837 generic.go:334] "Generic (PLEG): container finished" podID="64edb413-91a3-48ab-8d24-131c2d4fecb7" containerID="3c4a325735ed6f450667b69e99b26cdaf05c057ef1dedb91580f7f48ce7f4615" exitCode=0 Oct 14 13:05:15 crc kubenswrapper[4837]: I1014 13:05:15.946173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" event={"ID":"64edb413-91a3-48ab-8d24-131c2d4fecb7","Type":"ContainerDied","Data":"3c4a325735ed6f450667b69e99b26cdaf05c057ef1dedb91580f7f48ce7f4615"} Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.134243 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.160717 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt"] Oct 14 13:05:16 crc kubenswrapper[4837]: E1014 13:05:16.160958 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64edb413-91a3-48ab-8d24-131c2d4fecb7" containerName="oauth-openshift" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.160972 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="64edb413-91a3-48ab-8d24-131c2d4fecb7" containerName="oauth-openshift" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.161082 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="64edb413-91a3-48ab-8d24-131c2d4fecb7" containerName="oauth-openshift" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.161528 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.169939 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt"] Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203026 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-ocp-branding-template\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203077 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-router-certs\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: 
\"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203104 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcwvw\" (UniqueName: \"kubernetes.io/projected/64edb413-91a3-48ab-8d24-131c2d4fecb7-kube-api-access-jcwvw\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203134 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-provider-selection\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-error\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203209 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-policies\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203271 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-serving-cert\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 
13:05:16.203304 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-trusted-ca-bundle\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203333 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-service-ca\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-session\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203374 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-cliconfig\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203401 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-idp-0-file-data\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203426 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-dir\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.203455 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-login\") pod \"64edb413-91a3-48ab-8d24-131c2d4fecb7\" (UID: \"64edb413-91a3-48ab-8d24-131c2d4fecb7\") " Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.209326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.209336 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.209570 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.211843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.211855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.211925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.212274 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.212365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.212658 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.212811 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.213873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64edb413-91a3-48ab-8d24-131c2d4fecb7-kube-api-access-jcwvw" (OuterVolumeSpecName: "kube-api-access-jcwvw") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "kube-api-access-jcwvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.215203 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.215472 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.217655 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "64edb413-91a3-48ab-8d24-131c2d4fecb7" (UID: "64edb413-91a3-48ab-8d24-131c2d4fecb7"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.304833 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.304901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.304986 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305090 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305115 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276lm\" (UniqueName: \"kubernetes.io/projected/05443d35-4230-4b17-962a-8d3755342ca8-kube-api-access-276lm\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305209 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05443d35-4230-4b17-962a-8d3755342ca8-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305237 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305480 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305539 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305608 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305678 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-session\") pod 
\"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305712 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305807 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305831 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305850 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305871 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305890 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305910 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305931 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305949 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305967 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.305986 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcwvw\" (UniqueName: \"kubernetes.io/projected/64edb413-91a3-48ab-8d24-131c2d4fecb7-kube-api-access-jcwvw\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.306007 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.306024 4837 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.306042 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64edb413-91a3-48ab-8d24-131c2d4fecb7-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.306059 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64edb413-91a3-48ab-8d24-131c2d4fecb7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.407589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276lm\" (UniqueName: \"kubernetes.io/projected/05443d35-4230-4b17-962a-8d3755342ca8-kube-api-access-276lm\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.407717 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.407869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05443d35-4230-4b17-962a-8d3755342ca8-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " 
pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.407938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408020 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408231 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408460 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408517 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: 
\"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.408623 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.410417 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05443d35-4230-4b17-962a-8d3755342ca8-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.411511 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.412211 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.412451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.415510 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.416649 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.416832 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " 
pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.416994 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05443d35-4230-4b17-962a-8d3755342ca8-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.418113 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.418451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.422488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.423365 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.425149 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05443d35-4230-4b17-962a-8d3755342ca8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.446748 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276lm\" (UniqueName: \"kubernetes.io/projected/05443d35-4230-4b17-962a-8d3755342ca8-kube-api-access-276lm\") pod \"oauth-openshift-9bc7b6b6b-m6xrt\" (UID: \"05443d35-4230-4b17-962a-8d3755342ca8\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.486775 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.953734 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" event={"ID":"64edb413-91a3-48ab-8d24-131c2d4fecb7","Type":"ContainerDied","Data":"df80ac6ea36ab7dce44d6baa44385366979676040858bf8aec25ad4cf03474e3"} Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.953818 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8v59z" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.954085 4837 scope.go:117] "RemoveContainer" containerID="3c4a325735ed6f450667b69e99b26cdaf05c057ef1dedb91580f7f48ce7f4615" Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.976492 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8v59z"] Oct 14 13:05:16 crc kubenswrapper[4837]: I1014 13:05:16.980797 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8v59z"] Oct 14 13:05:17 crc kubenswrapper[4837]: I1014 13:05:17.002572 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt"] Oct 14 13:05:17 crc kubenswrapper[4837]: W1014 13:05:17.021372 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05443d35_4230_4b17_962a_8d3755342ca8.slice/crio-0a72354baefa2be5a90e14196247f6c0bc149614af5a6188c4b097855778b922 WatchSource:0}: Error finding container 0a72354baefa2be5a90e14196247f6c0bc149614af5a6188c4b097855778b922: Status 404 returned error can't find the container with id 0a72354baefa2be5a90e14196247f6c0bc149614af5a6188c4b097855778b922 Oct 14 13:05:17 crc kubenswrapper[4837]: I1014 13:05:17.963689 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" event={"ID":"05443d35-4230-4b17-962a-8d3755342ca8","Type":"ContainerStarted","Data":"5123f2a729f2c10db15a36f07b70c5933524d84823344c1a662b756f08336f06"} Oct 14 13:05:17 crc kubenswrapper[4837]: I1014 13:05:17.964259 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:17 crc kubenswrapper[4837]: I1014 13:05:17.964283 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" event={"ID":"05443d35-4230-4b17-962a-8d3755342ca8","Type":"ContainerStarted","Data":"0a72354baefa2be5a90e14196247f6c0bc149614af5a6188c4b097855778b922"} Oct 14 13:05:17 crc kubenswrapper[4837]: I1014 13:05:17.973215 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" Oct 14 13:05:17 crc kubenswrapper[4837]: I1014 13:05:17.995371 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-m6xrt" podStartSLOduration=27.995350777 podStartE2EDuration="27.995350777s" podCreationTimestamp="2025-10-14 13:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:05:17.994410541 +0000 UTC m=+255.911410454" watchObservedRunningTime="2025-10-14 13:05:17.995350777 +0000 UTC m=+255.912350600" Oct 14 13:05:18 crc kubenswrapper[4837]: I1014 13:05:18.794889 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64edb413-91a3-48ab-8d24-131c2d4fecb7" path="/var/lib/kubelet/pods/64edb413-91a3-48ab-8d24-131c2d4fecb7/volumes" Oct 14 13:06:41 crc kubenswrapper[4837]: I1014 13:06:41.140969 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:06:41 crc kubenswrapper[4837]: I1014 13:06:41.141810 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Oct 14 13:07:11 crc kubenswrapper[4837]: I1014 13:07:11.140213 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:07:11 crc kubenswrapper[4837]: I1014 13:07:11.140831 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.140599 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.141448 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.141532 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.143698 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b54de4ef780d289e7d2d626fe6e6ffa01ff36d90064fef6b767f17c485b0e770"} 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.143873 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://b54de4ef780d289e7d2d626fe6e6ffa01ff36d90064fef6b767f17c485b0e770" gracePeriod=600 Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.880388 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="b54de4ef780d289e7d2d626fe6e6ffa01ff36d90064fef6b767f17c485b0e770" exitCode=0 Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.880498 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"b54de4ef780d289e7d2d626fe6e6ffa01ff36d90064fef6b767f17c485b0e770"} Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.881186 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"c9df650ea9a0889b5303a141ba1c69bbbdcc6bf28d1e1e51c58ad3e80e0c7622"} Oct 14 13:07:41 crc kubenswrapper[4837]: I1014 13:07:41.881237 4837 scope.go:117] "RemoveContainer" containerID="45227c7c028c6f2acb78e996dc2b5fce22ba4ffc43f7c34180b3ceed5786b052" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.510788 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dt68v"] Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.512057 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.534505 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dt68v"] Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd6c2277-acad-4382-b9b3-b69b95017a8d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709150 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd6c2277-acad-4382-b9b3-b69b95017a8d-registry-certificates\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd6c2277-acad-4382-b9b3-b69b95017a8d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709252 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtgr\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-kube-api-access-vvtgr\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709421 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6c2277-acad-4382-b9b3-b69b95017a8d-trusted-ca\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709535 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-bound-sa-token\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709672 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-registry-tls\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.709753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.740743 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.810667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6c2277-acad-4382-b9b3-b69b95017a8d-trusted-ca\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.810748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-bound-sa-token\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.810821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-registry-tls\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.810878 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd6c2277-acad-4382-b9b3-b69b95017a8d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.810930 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd6c2277-acad-4382-b9b3-b69b95017a8d-registry-certificates\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.810976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd6c2277-acad-4382-b9b3-b69b95017a8d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.811007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtgr\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-kube-api-access-vvtgr\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.812242 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd6c2277-acad-4382-b9b3-b69b95017a8d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.812642 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6c2277-acad-4382-b9b3-b69b95017a8d-trusted-ca\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc 
kubenswrapper[4837]: I1014 13:08:58.813712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd6c2277-acad-4382-b9b3-b69b95017a8d-registry-certificates\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.818774 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-registry-tls\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.819663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd6c2277-acad-4382-b9b3-b69b95017a8d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.834346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-bound-sa-token\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.835590 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtgr\" (UniqueName: \"kubernetes.io/projected/bd6c2277-acad-4382-b9b3-b69b95017a8d-kube-api-access-vvtgr\") pod \"image-registry-66df7c8f76-dt68v\" (UID: \"bd6c2277-acad-4382-b9b3-b69b95017a8d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:58 crc kubenswrapper[4837]: I1014 13:08:58.836083 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:59 crc kubenswrapper[4837]: I1014 13:08:59.117117 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dt68v"] Oct 14 13:08:59 crc kubenswrapper[4837]: W1014 13:08:59.126838 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6c2277_acad_4382_b9b3_b69b95017a8d.slice/crio-d3d266cd85306c760169a8f6befafb0f9215cfc32ac7d096ede9e6f640d60c4a WatchSource:0}: Error finding container d3d266cd85306c760169a8f6befafb0f9215cfc32ac7d096ede9e6f640d60c4a: Status 404 returned error can't find the container with id d3d266cd85306c760169a8f6befafb0f9215cfc32ac7d096ede9e6f640d60c4a Oct 14 13:08:59 crc kubenswrapper[4837]: I1014 13:08:59.402212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" event={"ID":"bd6c2277-acad-4382-b9b3-b69b95017a8d","Type":"ContainerStarted","Data":"8e2805e465931ec01e917070ce8e68d6754d8fb03c18ecd30f08263ab8c859fb"} Oct 14 13:08:59 crc kubenswrapper[4837]: I1014 13:08:59.402274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" event={"ID":"bd6c2277-acad-4382-b9b3-b69b95017a8d","Type":"ContainerStarted","Data":"d3d266cd85306c760169a8f6befafb0f9215cfc32ac7d096ede9e6f640d60c4a"} Oct 14 13:08:59 crc kubenswrapper[4837]: I1014 13:08:59.402320 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:08:59 crc kubenswrapper[4837]: I1014 13:08:59.419029 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" podStartSLOduration=1.419005269 podStartE2EDuration="1.419005269s" podCreationTimestamp="2025-10-14 13:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:08:59.418037593 +0000 UTC m=+477.335037416" watchObservedRunningTime="2025-10-14 13:08:59.419005269 +0000 UTC m=+477.336005102" Oct 14 13:09:18 crc kubenswrapper[4837]: I1014 13:09:18.843787 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dt68v" Oct 14 13:09:18 crc kubenswrapper[4837]: I1014 13:09:18.914626 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqs75"] Oct 14 13:09:41 crc kubenswrapper[4837]: I1014 13:09:41.139950 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:09:41 crc kubenswrapper[4837]: I1014 13:09:41.140583 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:09:43 crc kubenswrapper[4837]: I1014 13:09:43.963558 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" podUID="14165edd-b69a-4886-8405-09298571b47b" containerName="registry" containerID="cri-o://b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14" gracePeriod=30 Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 
13:09:44.315184 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455485 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-registry-tls\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455556 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-registry-certificates\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455600 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-bound-sa-token\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455642 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14165edd-b69a-4886-8405-09298571b47b-ca-trust-extracted\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455674 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smdnn\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-kube-api-access-smdnn\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc 
kubenswrapper[4837]: I1014 13:09:44.455883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455916 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-trusted-ca\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.455961 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14165edd-b69a-4886-8405-09298571b47b-installation-pull-secrets\") pod \"14165edd-b69a-4886-8405-09298571b47b\" (UID: \"14165edd-b69a-4886-8405-09298571b47b\") " Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.457007 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.457483 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.462818 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14165edd-b69a-4886-8405-09298571b47b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.463152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.464128 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.464457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-kube-api-access-smdnn" (OuterVolumeSpecName: "kube-api-access-smdnn") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "kube-api-access-smdnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.468847 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.479762 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14165edd-b69a-4886-8405-09298571b47b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14165edd-b69a-4886-8405-09298571b47b" (UID: "14165edd-b69a-4886-8405-09298571b47b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557288 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557335 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557353 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14165edd-b69a-4886-8405-09298571b47b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557370 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smdnn\" (UniqueName: 
\"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-kube-api-access-smdnn\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557389 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14165edd-b69a-4886-8405-09298571b47b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557405 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14165edd-b69a-4886-8405-09298571b47b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.557423 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14165edd-b69a-4886-8405-09298571b47b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.686561 4837 generic.go:334] "Generic (PLEG): container finished" podID="14165edd-b69a-4886-8405-09298571b47b" containerID="b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14" exitCode=0 Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.686630 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" event={"ID":"14165edd-b69a-4886-8405-09298571b47b","Type":"ContainerDied","Data":"b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14"} Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.686659 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.686680 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqs75" event={"ID":"14165edd-b69a-4886-8405-09298571b47b","Type":"ContainerDied","Data":"9ce717d5f660a9b662fc6a8bac7451a44b82c6c2b93de616af1d2118d51d2b64"} Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.686714 4837 scope.go:117] "RemoveContainer" containerID="b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.726258 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqs75"] Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.730679 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqs75"] Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.732512 4837 scope.go:117] "RemoveContainer" containerID="b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14" Oct 14 13:09:44 crc kubenswrapper[4837]: E1014 13:09:44.733548 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14\": container with ID starting with b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14 not found: ID does not exist" containerID="b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.733624 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14"} err="failed to get container status \"b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14\": rpc error: code = NotFound desc = could not find 
container \"b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14\": container with ID starting with b6aaf3494736537e49cb5b6f24e9925258d77ae6351810cd205ea1404924dd14 not found: ID does not exist" Oct 14 13:09:44 crc kubenswrapper[4837]: I1014 13:09:44.794366 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14165edd-b69a-4886-8405-09298571b47b" path="/var/lib/kubelet/pods/14165edd-b69a-4886-8405-09298571b47b/volumes" Oct 14 13:10:02 crc kubenswrapper[4837]: I1014 13:10:02.957834 4837 scope.go:117] "RemoveContainer" containerID="0866912271ca49bec8d452b3ab4ba5af143d4c35befadeab968d91ef36f158d3" Oct 14 13:10:11 crc kubenswrapper[4837]: I1014 13:10:11.139987 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:10:11 crc kubenswrapper[4837]: I1014 13:10:11.140812 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.262286 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nqrzh"] Oct 14 13:10:18 crc kubenswrapper[4837]: E1014 13:10:18.262793 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14165edd-b69a-4886-8405-09298571b47b" containerName="registry" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.262805 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="14165edd-b69a-4886-8405-09298571b47b" containerName="registry" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 
13:10:18.262900 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="14165edd-b69a-4886-8405-09298571b47b" containerName="registry" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.263297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.264421 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-65869"] Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.264996 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-65869" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.267298 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8pf4v" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.267522 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-c9hzl" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.268375 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.273789 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.280115 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nqrzh"] Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.287389 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-65869"] Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.308485 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zktdz"] Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.309115 4837 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.312775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zqf4\" (UniqueName: \"kubernetes.io/projected/ffee16ce-49f5-418a-b83c-64b60165f84e-kube-api-access-7zqf4\") pod \"cert-manager-cainjector-7f985d654d-nqrzh\" (UID: \"ffee16ce-49f5-418a-b83c-64b60165f84e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.312815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndsz\" (UniqueName: \"kubernetes.io/projected/f782d810-8b08-4a07-b024-0481a26cf944-kube-api-access-pndsz\") pod \"cert-manager-5b446d88c5-65869\" (UID: \"f782d810-8b08-4a07-b024-0481a26cf944\") " pod="cert-manager/cert-manager-5b446d88c5-65869" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.313072 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pgbz9" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.325933 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zktdz"] Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.414387 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lklp9\" (UniqueName: \"kubernetes.io/projected/ca647993-67e2-4c73-b529-68deed403e7f-kube-api-access-lklp9\") pod \"cert-manager-webhook-5655c58dd6-zktdz\" (UID: \"ca647993-67e2-4c73-b529-68deed403e7f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.414444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndsz\" (UniqueName: 
\"kubernetes.io/projected/f782d810-8b08-4a07-b024-0481a26cf944-kube-api-access-pndsz\") pod \"cert-manager-5b446d88c5-65869\" (UID: \"f782d810-8b08-4a07-b024-0481a26cf944\") " pod="cert-manager/cert-manager-5b446d88c5-65869" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.414556 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zqf4\" (UniqueName: \"kubernetes.io/projected/ffee16ce-49f5-418a-b83c-64b60165f84e-kube-api-access-7zqf4\") pod \"cert-manager-cainjector-7f985d654d-nqrzh\" (UID: \"ffee16ce-49f5-418a-b83c-64b60165f84e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.440684 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndsz\" (UniqueName: \"kubernetes.io/projected/f782d810-8b08-4a07-b024-0481a26cf944-kube-api-access-pndsz\") pod \"cert-manager-5b446d88c5-65869\" (UID: \"f782d810-8b08-4a07-b024-0481a26cf944\") " pod="cert-manager/cert-manager-5b446d88c5-65869" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.450413 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zqf4\" (UniqueName: \"kubernetes.io/projected/ffee16ce-49f5-418a-b83c-64b60165f84e-kube-api-access-7zqf4\") pod \"cert-manager-cainjector-7f985d654d-nqrzh\" (UID: \"ffee16ce-49f5-418a-b83c-64b60165f84e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.515622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lklp9\" (UniqueName: \"kubernetes.io/projected/ca647993-67e2-4c73-b529-68deed403e7f-kube-api-access-lklp9\") pod \"cert-manager-webhook-5655c58dd6-zktdz\" (UID: \"ca647993-67e2-4c73-b529-68deed403e7f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.532830 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lklp9\" (UniqueName: \"kubernetes.io/projected/ca647993-67e2-4c73-b529-68deed403e7f-kube-api-access-lklp9\") pod \"cert-manager-webhook-5655c58dd6-zktdz\" (UID: \"ca647993-67e2-4c73-b529-68deed403e7f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.577460 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.587524 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-65869" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.623810 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.827559 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nqrzh"] Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.839339 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.885442 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" event={"ID":"ffee16ce-49f5-418a-b83c-64b60165f84e","Type":"ContainerStarted","Data":"c387406190fa9bc95d5c6b5e5a6d59b063adeabc0b8db181e2aad23ff8bde4dc"} Oct 14 13:10:18 crc kubenswrapper[4837]: I1014 13:10:18.901994 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zktdz"] Oct 14 13:10:19 crc kubenswrapper[4837]: I1014 13:10:19.054049 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-65869"] Oct 14 13:10:19 crc kubenswrapper[4837]: W1014 13:10:19.060458 4837 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf782d810_8b08_4a07_b024_0481a26cf944.slice/crio-8bc8c0f83366cb829a93695f47e753469df77b21db5c4e341a13ba26173e7a7c WatchSource:0}: Error finding container 8bc8c0f83366cb829a93695f47e753469df77b21db5c4e341a13ba26173e7a7c: Status 404 returned error can't find the container with id 8bc8c0f83366cb829a93695f47e753469df77b21db5c4e341a13ba26173e7a7c Oct 14 13:10:19 crc kubenswrapper[4837]: I1014 13:10:19.893673 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-65869" event={"ID":"f782d810-8b08-4a07-b024-0481a26cf944","Type":"ContainerStarted","Data":"8bc8c0f83366cb829a93695f47e753469df77b21db5c4e341a13ba26173e7a7c"} Oct 14 13:10:19 crc kubenswrapper[4837]: I1014 13:10:19.895548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" event={"ID":"ca647993-67e2-4c73-b529-68deed403e7f","Type":"ContainerStarted","Data":"e5269ab45d820ad3dcaf5735a7a6a993947bd7569090ddc08eee3a348a4c3af7"} Oct 14 13:10:21 crc kubenswrapper[4837]: I1014 13:10:21.908443 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" event={"ID":"ffee16ce-49f5-418a-b83c-64b60165f84e","Type":"ContainerStarted","Data":"e0ab29781ec013e25a66f5675da44e4aac040dd64070336e854c6e97494b00e3"} Oct 14 13:10:21 crc kubenswrapper[4837]: I1014 13:10:21.927584 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-nqrzh" podStartSLOduration=2.035353197 podStartE2EDuration="3.927565291s" podCreationTimestamp="2025-10-14 13:10:18 +0000 UTC" firstStartedPulling="2025-10-14 13:10:18.839074071 +0000 UTC m=+556.756073884" lastFinishedPulling="2025-10-14 13:10:20.731286155 +0000 UTC m=+558.648285978" observedRunningTime="2025-10-14 13:10:21.925086444 +0000 UTC 
m=+559.842086297" watchObservedRunningTime="2025-10-14 13:10:21.927565291 +0000 UTC m=+559.844565104" Oct 14 13:10:22 crc kubenswrapper[4837]: I1014 13:10:22.917945 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-65869" event={"ID":"f782d810-8b08-4a07-b024-0481a26cf944","Type":"ContainerStarted","Data":"78dead4da117dcbdc8b94a8c08bc47f5a4ad10bdef8bbbbcf754c6f17477bf0d"} Oct 14 13:10:22 crc kubenswrapper[4837]: I1014 13:10:22.921468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" event={"ID":"ca647993-67e2-4c73-b529-68deed403e7f","Type":"ContainerStarted","Data":"996201183f98a1cfcf97549d247e649ab64e9ecce1da5f18bb1bc082b8ff6197"} Oct 14 13:10:22 crc kubenswrapper[4837]: I1014 13:10:22.921542 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:22 crc kubenswrapper[4837]: I1014 13:10:22.936853 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-65869" podStartSLOduration=1.9928400769999999 podStartE2EDuration="4.936839653s" podCreationTimestamp="2025-10-14 13:10:18 +0000 UTC" firstStartedPulling="2025-10-14 13:10:19.062526383 +0000 UTC m=+556.979526196" lastFinishedPulling="2025-10-14 13:10:22.006525959 +0000 UTC m=+559.923525772" observedRunningTime="2025-10-14 13:10:22.931734664 +0000 UTC m=+560.848734477" watchObservedRunningTime="2025-10-14 13:10:22.936839653 +0000 UTC m=+560.853839466" Oct 14 13:10:28 crc kubenswrapper[4837]: I1014 13:10:28.629769 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" Oct 14 13:10:28 crc kubenswrapper[4837]: I1014 13:10:28.656413 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-zktdz" podStartSLOduration=7.622769371 
podStartE2EDuration="10.656388704s" podCreationTimestamp="2025-10-14 13:10:18 +0000 UTC" firstStartedPulling="2025-10-14 13:10:18.911997276 +0000 UTC m=+556.828997089" lastFinishedPulling="2025-10-14 13:10:21.945616609 +0000 UTC m=+559.862616422" observedRunningTime="2025-10-14 13:10:22.946576376 +0000 UTC m=+560.863576189" watchObservedRunningTime="2025-10-14 13:10:28.656388704 +0000 UTC m=+566.573388557" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.038988 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xfw4j"] Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.039905 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-controller" containerID="cri-o://d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.040611 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="sbdb" containerID="cri-o://b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.040747 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="nbdb" containerID="cri-o://d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.040848 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="northd" 
containerID="cri-o://b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.040938 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.041031 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-node" containerID="cri-o://d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.041123 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-acl-logging" containerID="cri-o://817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.089600 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" containerID="cri-o://fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" gracePeriod=30 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.342586 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/3.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.346043 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovn-acl-logging/0.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.346733 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovn-controller/0.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.347140 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366150 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-node-log\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366220 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-etc-openvswitch\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366241 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-netd\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366271 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: 
\"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366298 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-kubelet\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366317 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-var-lib-openvswitch\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366337 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-ovn\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366365 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-slash\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-config\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366479 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-openvswitch\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-script-lib\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366545 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6t64\" (UniqueName: \"kubernetes.io/projected/f670a3c6-520c-45ba-980a-00c63703b02b-kube-api-access-j6t64\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366567 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f670a3c6-520c-45ba-980a-00c63703b02b-ovn-node-metrics-cert\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366587 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-systemd\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366612 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-ovn-kubernetes\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 
13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366645 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-env-overrides\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366663 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-netns\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-systemd-units\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366720 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-log-socket\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366744 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-bin\") pod \"f670a3c6-520c-45ba-980a-00c63703b02b\" (UID: \"f670a3c6-520c-45ba-980a-00c63703b02b\") " Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.366957 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-bin" (OuterVolumeSpecName: 
"host-cni-bin") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367004 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-node-log" (OuterVolumeSpecName: "node-log") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367030 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367053 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367080 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367107 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367129 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367151 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367199 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-slash" (OuterVolumeSpecName: "host-slash") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367560 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367593 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.367876 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.368354 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.368394 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.368399 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.368417 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-log-socket" (OuterVolumeSpecName: "log-socket") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.368600 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.372966 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f670a3c6-520c-45ba-980a-00c63703b02b-kube-api-access-j6t64" (OuterVolumeSpecName: "kube-api-access-j6t64") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "kube-api-access-j6t64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.373194 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f670a3c6-520c-45ba-980a-00c63703b02b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.381670 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f670a3c6-520c-45ba-980a-00c63703b02b" (UID: "f670a3c6-520c-45ba-980a-00c63703b02b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401699 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qb8wt"] Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401881 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401894 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401904 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-acl-logging" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401909 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-acl-logging" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401918 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401924 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401931 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401937 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401944 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" 
containerName="kube-rbac-proxy-ovn-metrics" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401950 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-ovn-metrics" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401960 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kubecfg-setup" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401966 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kubecfg-setup" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401974 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401979 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.401989 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-node" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.401994 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-node" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.402003 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="sbdb" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402009 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="sbdb" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.402017 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" 
containerName="nbdb" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402023 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="nbdb" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.402032 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="northd" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402037 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="northd" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402122 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-node" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402132 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="kube-rbac-proxy-ovn-metrics" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402138 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402147 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="northd" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402168 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402175 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovn-acl-logging" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402183 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" 
containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402189 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402197 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="nbdb" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402204 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="sbdb" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402211 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.402288 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402295 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402390 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.402470 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.402477 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" containerName="ovnkube-controller" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.403731 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.467908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-systemd-units\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.467967 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-run-netns\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.467997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovn-node-metrics-cert\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468021 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-etc-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468047 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-openvswitch\") pod 
\"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468068 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-kubelet\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468125 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-log-socket\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468183 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-env-overrides\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468202 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-systemd\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468222 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468243 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovnkube-script-lib\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468272 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-cni-bin\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468289 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovnkube-config\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-ovn\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468328 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-node-log\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468347 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lg4\" (UniqueName: \"kubernetes.io/projected/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-kube-api-access-x4lg4\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468374 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-slash\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468412 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-var-lib-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468433 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468515 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-cni-netd\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468572 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468585 4837 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468595 4837 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-log-socket\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468605 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468614 4837 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-node-log\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468625 4837 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468634 4837 reconciler_common.go:293] "Volume 
detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468646 4837 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468657 4837 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468668 4837 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468678 4837 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468687 4837 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-slash\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468697 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468707 4837 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468724 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468736 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6t64\" (UniqueName: \"kubernetes.io/projected/f670a3c6-520c-45ba-980a-00c63703b02b-kube-api-access-j6t64\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468746 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f670a3c6-520c-45ba-980a-00c63703b02b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468757 4837 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468767 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f670a3c6-520c-45ba-980a-00c63703b02b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.468778 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f670a3c6-520c-45ba-980a-00c63703b02b-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-slash\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569818 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-var-lib-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569854 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-slash\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-var-lib-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569866 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-cni-netd\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.569992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-systemd-units\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-cni-netd\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570019 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-run-netns\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570036 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-systemd-units\") pod 
\"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570039 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovn-node-metrics-cert\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570058 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-run-netns\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570065 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-etc-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570089 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570110 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-kubelet\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-log-socket\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570181 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-env-overrides\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570197 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-systemd\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovnkube-script-lib\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570262 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-cni-bin\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570279 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovnkube-config\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-ovn\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-node-log\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lg4\" (UniqueName: \"kubernetes.io/projected/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-kube-api-access-x4lg4\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: 
I1014 13:10:29.570654 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-etc-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.570688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-openvswitch\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.571680 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-kubelet\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.571731 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-log-socket\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-cni-bin\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-env-overrides\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572338 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-systemd\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572361 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovnkube-config\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572754 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-run-ovn\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.572777 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-node-log\") pod 
\"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.573426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovnkube-script-lib\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.574952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-ovn-node-metrics-cert\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.586129 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lg4\" (UniqueName: \"kubernetes.io/projected/f710caac-e1f1-4e3d-bcf8-9965cde2adb9-kube-api-access-x4lg4\") pod \"ovnkube-node-qb8wt\" (UID: \"f710caac-e1f1-4e3d-bcf8-9965cde2adb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.725444 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.962813 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerDied","Data":"3fc3932951af89f9cb97d0a220b349e6071b35dd1039a0e4dc963252be09ec0e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.962734 4837 generic.go:334] "Generic (PLEG): container finished" podID="f710caac-e1f1-4e3d-bcf8-9965cde2adb9" containerID="3fc3932951af89f9cb97d0a220b349e6071b35dd1039a0e4dc963252be09ec0e" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.963391 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"701a18a0da4f2b73d81d02aa2ed7ca00cba11b0b716ea686517cad280a5d273e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.966462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/2.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.967361 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/1.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.967419 4837 generic.go:334] "Generic (PLEG): container finished" podID="01492025-d672-4746-af22-53fa41a3f612" containerID="9b9800c7caf14c369455bcd4d508943981bd965b9cfd5812889d0c36580034f5" exitCode=2 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.967504 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerDied","Data":"9b9800c7caf14c369455bcd4d508943981bd965b9cfd5812889d0c36580034f5"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 
13:10:29.967545 4837 scope.go:117] "RemoveContainer" containerID="01509342f358ec1c1348cf36712785470635cb1625689f3964ab592fe4887fc5" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.968124 4837 scope.go:117] "RemoveContainer" containerID="9b9800c7caf14c369455bcd4d508943981bd965b9cfd5812889d0c36580034f5" Oct 14 13:10:29 crc kubenswrapper[4837]: E1014 13:10:29.968447 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-s6qr4_openshift-multus(01492025-d672-4746-af22-53fa41a3f612)\"" pod="openshift-multus/multus-s6qr4" podUID="01492025-d672-4746-af22-53fa41a3f612" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.971108 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovnkube-controller/3.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.975893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovn-acl-logging/0.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.976633 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfw4j_f670a3c6-520c-45ba-980a-00c63703b02b/ovn-controller/0.log" Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977371 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977406 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977418 4837 
generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977427 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977436 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977445 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" exitCode=0 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977454 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" exitCode=143 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977465 4837 generic.go:334] "Generic (PLEG): container finished" podID="f670a3c6-520c-45ba-980a-00c63703b02b" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" exitCode=143 Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977489 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" 
event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977538 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977567 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977581 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977596 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977609 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977617 4837 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977624 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977632 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977639 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977646 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977653 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977660 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977667 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977678 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977688 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977697 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977704 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977711 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977719 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977726 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977733 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} Oct 14 
13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977741 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977749 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977756 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977779 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977788 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977796 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977804 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977812 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977819 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977828 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977835 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977842 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977849 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" event={"ID":"f670a3c6-520c-45ba-980a-00c63703b02b","Type":"ContainerDied","Data":"6ce098dad1f402fc79fe5ee600058e8d4dab7229ca91c0690f68b99d6b60e4e0"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977870 4837 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977879 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977923 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977935 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977952 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977967 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977978 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977987 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977994 4837 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.978003 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} Oct 14 13:10:29 crc kubenswrapper[4837]: I1014 13:10:29.977583 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfw4j" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.063346 4837 scope.go:117] "RemoveContainer" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.065266 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xfw4j"] Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.078344 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xfw4j"] Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.111059 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.126206 4837 scope.go:117] "RemoveContainer" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.138337 4837 scope.go:117] "RemoveContainer" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.152015 4837 scope.go:117] "RemoveContainer" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.195055 4837 scope.go:117] "RemoveContainer" 
containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.215324 4837 scope.go:117] "RemoveContainer" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.228484 4837 scope.go:117] "RemoveContainer" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.238009 4837 scope.go:117] "RemoveContainer" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.250768 4837 scope.go:117] "RemoveContainer" containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.286614 4837 scope.go:117] "RemoveContainer" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.287124 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": container with ID starting with fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a not found: ID does not exist" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.287230 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} err="failed to get container status \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": rpc error: code = NotFound desc = could not find container \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": container with ID starting with 
fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.287264 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.298373 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": container with ID starting with 8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a not found: ID does not exist" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.298418 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} err="failed to get container status \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": rpc error: code = NotFound desc = could not find container \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": container with ID starting with 8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.298444 4837 scope.go:117] "RemoveContainer" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.298815 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": container with ID starting with b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3 not found: ID does not exist" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" Oct 14 13:10:30 crc 
kubenswrapper[4837]: I1014 13:10:30.298853 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} err="failed to get container status \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": rpc error: code = NotFound desc = could not find container \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": container with ID starting with b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.298878 4837 scope.go:117] "RemoveContainer" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.299194 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": container with ID starting with d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62 not found: ID does not exist" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.299255 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} err="failed to get container status \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": rpc error: code = NotFound desc = could not find container \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": container with ID starting with d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.299299 4837 scope.go:117] "RemoveContainer" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" Oct 14 
13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.299610 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": container with ID starting with b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3 not found: ID does not exist" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.299648 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} err="failed to get container status \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": rpc error: code = NotFound desc = could not find container \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": container with ID starting with b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.299672 4837 scope.go:117] "RemoveContainer" containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.299870 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": container with ID starting with c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f not found: ID does not exist" containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.299904 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} err="failed to get container status 
\"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": rpc error: code = NotFound desc = could not find container \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": container with ID starting with c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.299927 4837 scope.go:117] "RemoveContainer" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.300326 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": container with ID starting with d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319 not found: ID does not exist" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.300362 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} err="failed to get container status \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": rpc error: code = NotFound desc = could not find container \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": container with ID starting with d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.300385 4837 scope.go:117] "RemoveContainer" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.300664 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": container with ID starting with 817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7 not found: ID does not exist" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.300702 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} err="failed to get container status \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": rpc error: code = NotFound desc = could not find container \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": container with ID starting with 817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.300724 4837 scope.go:117] "RemoveContainer" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.301005 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": container with ID starting with d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e not found: ID does not exist" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.301039 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} err="failed to get container status \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": rpc error: code = NotFound desc = could not find container \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": container with ID 
starting with d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.301061 4837 scope.go:117] "RemoveContainer" containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" Oct 14 13:10:30 crc kubenswrapper[4837]: E1014 13:10:30.301344 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": container with ID starting with 9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa not found: ID does not exist" containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.301371 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} err="failed to get container status \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": rpc error: code = NotFound desc = could not find container \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": container with ID starting with 9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.301389 4837 scope.go:117] "RemoveContainer" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.301670 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} err="failed to get container status \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": rpc error: code = NotFound desc = could not find container \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": 
container with ID starting with fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.301704 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302062 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} err="failed to get container status \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": rpc error: code = NotFound desc = could not find container \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": container with ID starting with 8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302101 4837 scope.go:117] "RemoveContainer" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302367 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} err="failed to get container status \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": rpc error: code = NotFound desc = could not find container \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": container with ID starting with b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302384 4837 scope.go:117] "RemoveContainer" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302635 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} err="failed to get container status \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": rpc error: code = NotFound desc = could not find container \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": container with ID starting with d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302667 4837 scope.go:117] "RemoveContainer" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302869 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} err="failed to get container status \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": rpc error: code = NotFound desc = could not find container \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": container with ID starting with b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.302886 4837 scope.go:117] "RemoveContainer" containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303171 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} err="failed to get container status \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": rpc error: code = NotFound desc = could not find container \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": container with ID starting with c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f not found: ID does not 
exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303240 4837 scope.go:117] "RemoveContainer" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303556 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} err="failed to get container status \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": rpc error: code = NotFound desc = could not find container \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": container with ID starting with d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303584 4837 scope.go:117] "RemoveContainer" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303782 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} err="failed to get container status \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": rpc error: code = NotFound desc = could not find container \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": container with ID starting with 817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303800 4837 scope.go:117] "RemoveContainer" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303970 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} err="failed to get container status 
\"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": rpc error: code = NotFound desc = could not find container \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": container with ID starting with d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.303989 4837 scope.go:117] "RemoveContainer" containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304240 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} err="failed to get container status \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": rpc error: code = NotFound desc = could not find container \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": container with ID starting with 9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304271 4837 scope.go:117] "RemoveContainer" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304496 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} err="failed to get container status \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": rpc error: code = NotFound desc = could not find container \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": container with ID starting with fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304514 4837 scope.go:117] "RemoveContainer" 
containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304724 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} err="failed to get container status \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": rpc error: code = NotFound desc = could not find container \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": container with ID starting with 8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304763 4837 scope.go:117] "RemoveContainer" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304942 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} err="failed to get container status \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": rpc error: code = NotFound desc = could not find container \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": container with ID starting with b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.304962 4837 scope.go:117] "RemoveContainer" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305110 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} err="failed to get container status \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": rpc error: code = NotFound desc = could 
not find container \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": container with ID starting with d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305124 4837 scope.go:117] "RemoveContainer" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305352 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} err="failed to get container status \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": rpc error: code = NotFound desc = could not find container \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": container with ID starting with b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305394 4837 scope.go:117] "RemoveContainer" containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305566 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} err="failed to get container status \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": rpc error: code = NotFound desc = could not find container \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": container with ID starting with c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305583 4837 scope.go:117] "RemoveContainer" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 
13:10:30.305822 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} err="failed to get container status \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": rpc error: code = NotFound desc = could not find container \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": container with ID starting with d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.305853 4837 scope.go:117] "RemoveContainer" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306089 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} err="failed to get container status \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": rpc error: code = NotFound desc = could not find container \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": container with ID starting with 817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306105 4837 scope.go:117] "RemoveContainer" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306295 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} err="failed to get container status \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": rpc error: code = NotFound desc = could not find container \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": container with ID starting with 
d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306312 4837 scope.go:117] "RemoveContainer" containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306533 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} err="failed to get container status \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": rpc error: code = NotFound desc = could not find container \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": container with ID starting with 9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306593 4837 scope.go:117] "RemoveContainer" containerID="fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306808 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a"} err="failed to get container status \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": rpc error: code = NotFound desc = could not find container \"fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a\": container with ID starting with fa2ce935ca6a27958519630ddf20d3fa98baa77732efb23f6b555c04abe97d8a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306825 4837 scope.go:117] "RemoveContainer" containerID="8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.306991 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a"} err="failed to get container status \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": rpc error: code = NotFound desc = could not find container \"8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a\": container with ID starting with 8d39d915090e0ec11038fc0e503e4dee60421992248327153fcdfb356fde564a not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307008 4837 scope.go:117] "RemoveContainer" containerID="b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307147 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3"} err="failed to get container status \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": rpc error: code = NotFound desc = could not find container \"b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3\": container with ID starting with b01af2b9917c562c7cbca20f338600140a71af7880e30e30586853b8c26f8ec3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307176 4837 scope.go:117] "RemoveContainer" containerID="d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307352 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62"} err="failed to get container status \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": rpc error: code = NotFound desc = could not find container \"d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62\": container with ID starting with d7a868f08b0be2f2090fa48ec2ae4cb49db8b28fe855b535455990e134babc62 not found: ID does not 
exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307389 4837 scope.go:117] "RemoveContainer" containerID="b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307643 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3"} err="failed to get container status \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": rpc error: code = NotFound desc = could not find container \"b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3\": container with ID starting with b6039f83662aee4d662c12193f353ca089e693963fe11e72e12c83580b3733d3 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307661 4837 scope.go:117] "RemoveContainer" containerID="c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307897 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f"} err="failed to get container status \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": rpc error: code = NotFound desc = could not find container \"c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f\": container with ID starting with c244cf3e9ab8e10c7fc1fa777a62eae4f70b1fed7c1dd2f37658961b75a8167f not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.307952 4837 scope.go:117] "RemoveContainer" containerID="d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308171 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319"} err="failed to get container status 
\"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": rpc error: code = NotFound desc = could not find container \"d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319\": container with ID starting with d0371fb71da80cd5167937d2ffb77fc00d9737dae632a5be79743c3c59b8d319 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308189 4837 scope.go:117] "RemoveContainer" containerID="817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308398 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7"} err="failed to get container status \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": rpc error: code = NotFound desc = could not find container \"817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7\": container with ID starting with 817bdc34b5e14056c7f8d3fb4ec8cfdc2fa53503fe5401eed50ab59df003d5c7 not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308437 4837 scope.go:117] "RemoveContainer" containerID="d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308637 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e"} err="failed to get container status \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": rpc error: code = NotFound desc = could not find container \"d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e\": container with ID starting with d44a083d0bc6689cc60b45dbd04115171877bfcc9d04d5f21d59d2c3fd88d67e not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308657 4837 scope.go:117] "RemoveContainer" 
containerID="9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.308856 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa"} err="failed to get container status \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": rpc error: code = NotFound desc = could not find container \"9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa\": container with ID starting with 9b3497f1d43393346047d7a12a88de6a4cd38b212b69878b8b6dc88f5fc79aaa not found: ID does not exist" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.799151 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f670a3c6-520c-45ba-980a-00c63703b02b" path="/var/lib/kubelet/pods/f670a3c6-520c-45ba-980a-00c63703b02b/volumes" Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.987834 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"8314e9cbab97e955ef557a53f3b27976961fe1ff356dba8e022136c06c1255dd"} Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.988271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"4553552f1684a396f559d455781f34e127842702afe4e5db2b83b28597cc2c9a"} Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.988307 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"003ddd2a51903f84f608ceaedad18fb7b163baa6b9ee75a04d71c275884e9688"} Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.988328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"f4350129ab6cc32cbfbe6b60bdf7267fef96502e3086fa81b8afa5b8081b7a74"} Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.988349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"c4a29a03de152e4e3419e828cf0f6a932d5e431de84689a0688f36795043adc1"} Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.988367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"a80b0fcb822af94a68c2287297feb3d5ed4eb94c8e3039933b94c73e6911f5ad"} Oct 14 13:10:30 crc kubenswrapper[4837]: I1014 13:10:30.990089 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/2.log" Oct 14 13:10:34 crc kubenswrapper[4837]: I1014 13:10:34.031504 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"dafca70fe0e4f7f343c975df19a4b860ad93879abcf626a31362d51ec6d91e3e"} Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.046803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" event={"ID":"f710caac-e1f1-4e3d-bcf8-9965cde2adb9","Type":"ContainerStarted","Data":"3c700122b65911c94885305e1896f4dbb294fe38db4aec240c5fe2c2e9078b4a"} Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.048930 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.048976 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.049053 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.100464 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" podStartSLOduration=7.100436955 podStartE2EDuration="7.100436955s" podCreationTimestamp="2025-10-14 13:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:10:36.094574537 +0000 UTC m=+574.011574380" watchObservedRunningTime="2025-10-14 13:10:36.100436955 +0000 UTC m=+574.017436808" Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.101882 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:36 crc kubenswrapper[4837]: I1014 13:10:36.106086 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:10:40 crc kubenswrapper[4837]: I1014 13:10:40.785412 4837 scope.go:117] "RemoveContainer" containerID="9b9800c7caf14c369455bcd4d508943981bd965b9cfd5812889d0c36580034f5" Oct 14 13:10:40 crc kubenswrapper[4837]: E1014 13:10:40.786239 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-s6qr4_openshift-multus(01492025-d672-4746-af22-53fa41a3f612)\"" pod="openshift-multus/multus-s6qr4" podUID="01492025-d672-4746-af22-53fa41a3f612" Oct 14 13:10:41 crc kubenswrapper[4837]: I1014 13:10:41.139834 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:10:41 crc kubenswrapper[4837]: I1014 13:10:41.140262 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:10:41 crc kubenswrapper[4837]: I1014 13:10:41.140326 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:10:41 crc kubenswrapper[4837]: I1014 13:10:41.141053 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9df650ea9a0889b5303a141ba1c69bbbdcc6bf28d1e1e51c58ad3e80e0c7622"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:10:41 crc kubenswrapper[4837]: I1014 13:10:41.141195 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://c9df650ea9a0889b5303a141ba1c69bbbdcc6bf28d1e1e51c58ad3e80e0c7622" gracePeriod=600 Oct 14 13:10:42 crc kubenswrapper[4837]: I1014 13:10:42.092347 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="c9df650ea9a0889b5303a141ba1c69bbbdcc6bf28d1e1e51c58ad3e80e0c7622" exitCode=0 Oct 14 13:10:42 crc kubenswrapper[4837]: I1014 13:10:42.092471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"c9df650ea9a0889b5303a141ba1c69bbbdcc6bf28d1e1e51c58ad3e80e0c7622"} Oct 14 13:10:42 crc kubenswrapper[4837]: I1014 13:10:42.092913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"a9c5da248ef4f304e8c83104496af5297a77f5eb3df38f2188353642fbfdb087"} Oct 14 13:10:42 crc kubenswrapper[4837]: I1014 13:10:42.092949 4837 scope.go:117] "RemoveContainer" containerID="b54de4ef780d289e7d2d626fe6e6ffa01ff36d90064fef6b767f17c485b0e770" Oct 14 13:10:53 crc kubenswrapper[4837]: I1014 13:10:53.784464 4837 scope.go:117] "RemoveContainer" containerID="9b9800c7caf14c369455bcd4d508943981bd965b9cfd5812889d0c36580034f5" Oct 14 13:10:54 crc kubenswrapper[4837]: I1014 13:10:54.178909 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6qr4_01492025-d672-4746-af22-53fa41a3f612/kube-multus/2.log" Oct 14 13:10:54 crc kubenswrapper[4837]: I1014 13:10:54.179271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6qr4" event={"ID":"01492025-d672-4746-af22-53fa41a3f612","Type":"ContainerStarted","Data":"1ef826dfa1ed6b08d9dffbf2a341a54ec7e13c3378569e2369c86674ba4d0259"} Oct 14 13:10:59 crc kubenswrapper[4837]: I1014 13:10:59.753578 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qb8wt" Oct 14 13:11:03 crc kubenswrapper[4837]: I1014 13:11:03.023378 4837 scope.go:117] "RemoveContainer" containerID="1025f4d0102daf1c6c0ba3652e718896497c62c653d0b77f542a7bae643b35e8" Oct 14 13:11:03 crc kubenswrapper[4837]: I1014 13:11:03.050178 4837 scope.go:117] "RemoveContainer" containerID="ccc385c2d1cc65f766d3d40ab5801f954329b67a51a2a92be263a66d362600aa" Oct 14 13:11:08 crc kubenswrapper[4837]: I1014 13:11:08.909199 4837 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv"] Oct 14 13:11:08 crc kubenswrapper[4837]: I1014 13:11:08.910948 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:08 crc kubenswrapper[4837]: I1014 13:11:08.923655 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv"] Oct 14 13:11:08 crc kubenswrapper[4837]: I1014 13:11:08.960401 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.014175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.014239 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tztg\" (UniqueName: \"kubernetes.io/projected/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-kube-api-access-2tztg\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.014627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-bundle\") pod 
\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.116665 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.116825 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.116875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tztg\" (UniqueName: \"kubernetes.io/projected/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-kube-api-access-2tztg\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.117382 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.117525 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.149035 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tztg\" (UniqueName: \"kubernetes.io/projected/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-kube-api-access-2tztg\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.270026 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:09 crc kubenswrapper[4837]: I1014 13:11:09.758337 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv"] Oct 14 13:11:09 crc kubenswrapper[4837]: W1014 13:11:09.773408 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod306a4a1e_e6b7_4efb_aeba_2b570be7a5e6.slice/crio-35e67da758b7457c26c8e6accda2a3fe89500b6a06d2ee598422f12e17a6e6a4 WatchSource:0}: Error finding container 35e67da758b7457c26c8e6accda2a3fe89500b6a06d2ee598422f12e17a6e6a4: Status 404 returned error can't find the container with id 35e67da758b7457c26c8e6accda2a3fe89500b6a06d2ee598422f12e17a6e6a4 Oct 14 13:11:10 crc kubenswrapper[4837]: I1014 13:11:10.285565 4837 generic.go:334] "Generic (PLEG): container finished" podID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerID="d2927272bc7c70eeff4ba9ff5e6371359082f97e30e59f5716b38cc9f3e9bd9f" exitCode=0 Oct 14 13:11:10 crc kubenswrapper[4837]: I1014 13:11:10.285675 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" event={"ID":"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6","Type":"ContainerDied","Data":"d2927272bc7c70eeff4ba9ff5e6371359082f97e30e59f5716b38cc9f3e9bd9f"} Oct 14 13:11:10 crc kubenswrapper[4837]: I1014 13:11:10.286038 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" event={"ID":"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6","Type":"ContainerStarted","Data":"35e67da758b7457c26c8e6accda2a3fe89500b6a06d2ee598422f12e17a6e6a4"} Oct 14 13:11:12 crc kubenswrapper[4837]: I1014 13:11:12.306067 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerID="21c1966bd832498af2a6019e88efe7448c4eae5e0fe67a4361de0543a382b1f4" exitCode=0 Oct 14 13:11:12 crc kubenswrapper[4837]: I1014 13:11:12.306288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" event={"ID":"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6","Type":"ContainerDied","Data":"21c1966bd832498af2a6019e88efe7448c4eae5e0fe67a4361de0543a382b1f4"} Oct 14 13:11:13 crc kubenswrapper[4837]: I1014 13:11:13.316532 4837 generic.go:334] "Generic (PLEG): container finished" podID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerID="13c49473d2e46978cd13176cafdbfc783e9dc6e130f78629183b82bf52a04871" exitCode=0 Oct 14 13:11:13 crc kubenswrapper[4837]: I1014 13:11:13.316588 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" event={"ID":"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6","Type":"ContainerDied","Data":"13c49473d2e46978cd13176cafdbfc783e9dc6e130f78629183b82bf52a04871"} Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.714273 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.897499 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-util\") pod \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.898350 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tztg\" (UniqueName: \"kubernetes.io/projected/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-kube-api-access-2tztg\") pod \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.898516 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-bundle\") pod \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\" (UID: \"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6\") " Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.899498 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-bundle" (OuterVolumeSpecName: "bundle") pod "306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" (UID: "306a4a1e-e6b7-4efb-aeba-2b570be7a5e6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.909397 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-kube-api-access-2tztg" (OuterVolumeSpecName: "kube-api-access-2tztg") pod "306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" (UID: "306a4a1e-e6b7-4efb-aeba-2b570be7a5e6"). InnerVolumeSpecName "kube-api-access-2tztg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:11:14 crc kubenswrapper[4837]: I1014 13:11:14.919764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-util" (OuterVolumeSpecName: "util") pod "306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" (UID: "306a4a1e-e6b7-4efb-aeba-2b570be7a5e6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:11:15 crc kubenswrapper[4837]: I1014 13:11:15.000362 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-util\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:15 crc kubenswrapper[4837]: I1014 13:11:15.000407 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tztg\" (UniqueName: \"kubernetes.io/projected/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-kube-api-access-2tztg\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:15 crc kubenswrapper[4837]: I1014 13:11:15.000427 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/306a4a1e-e6b7-4efb-aeba-2b570be7a5e6-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:15 crc kubenswrapper[4837]: I1014 13:11:15.337369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" event={"ID":"306a4a1e-e6b7-4efb-aeba-2b570be7a5e6","Type":"ContainerDied","Data":"35e67da758b7457c26c8e6accda2a3fe89500b6a06d2ee598422f12e17a6e6a4"} Oct 14 13:11:15 crc kubenswrapper[4837]: I1014 13:11:15.337429 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e67da758b7457c26c8e6accda2a3fe89500b6a06d2ee598422f12e17a6e6a4" Oct 14 13:11:15 crc kubenswrapper[4837]: I1014 13:11:15.337878 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.360273 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-qmtct"] Oct 14 13:11:16 crc kubenswrapper[4837]: E1014 13:11:16.360529 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="pull" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.360544 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="pull" Oct 14 13:11:16 crc kubenswrapper[4837]: E1014 13:11:16.360560 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="util" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.360568 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="util" Oct 14 13:11:16 crc kubenswrapper[4837]: E1014 13:11:16.360580 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="extract" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.360589 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="extract" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.360705 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="306a4a1e-e6b7-4efb-aeba-2b570be7a5e6" containerName="extract" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.361133 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.362607 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6k4zw" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.362706 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.362939 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.374353 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-qmtct"] Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.419753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjt2\" (UniqueName: \"kubernetes.io/projected/a0b26320-e880-47dc-8ead-5b4547870db1-kube-api-access-9jjt2\") pod \"nmstate-operator-858ddd8f98-qmtct\" (UID: \"a0b26320-e880-47dc-8ead-5b4547870db1\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.520846 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjt2\" (UniqueName: \"kubernetes.io/projected/a0b26320-e880-47dc-8ead-5b4547870db1-kube-api-access-9jjt2\") pod \"nmstate-operator-858ddd8f98-qmtct\" (UID: \"a0b26320-e880-47dc-8ead-5b4547870db1\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.540021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjt2\" (UniqueName: \"kubernetes.io/projected/a0b26320-e880-47dc-8ead-5b4547870db1-kube-api-access-9jjt2\") pod \"nmstate-operator-858ddd8f98-qmtct\" (UID: 
\"a0b26320-e880-47dc-8ead-5b4547870db1\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" Oct 14 13:11:16 crc kubenswrapper[4837]: I1014 13:11:16.675129 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" Oct 14 13:11:17 crc kubenswrapper[4837]: I1014 13:11:17.001966 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-qmtct"] Oct 14 13:11:17 crc kubenswrapper[4837]: W1014 13:11:17.012418 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b26320_e880_47dc_8ead_5b4547870db1.slice/crio-8950471f8ad6e8db19e7f47dae60ffa3a39b131b565df5caf61ed6d5f5ea10ba WatchSource:0}: Error finding container 8950471f8ad6e8db19e7f47dae60ffa3a39b131b565df5caf61ed6d5f5ea10ba: Status 404 returned error can't find the container with id 8950471f8ad6e8db19e7f47dae60ffa3a39b131b565df5caf61ed6d5f5ea10ba Oct 14 13:11:17 crc kubenswrapper[4837]: I1014 13:11:17.348476 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" event={"ID":"a0b26320-e880-47dc-8ead-5b4547870db1","Type":"ContainerStarted","Data":"8950471f8ad6e8db19e7f47dae60ffa3a39b131b565df5caf61ed6d5f5ea10ba"} Oct 14 13:11:20 crc kubenswrapper[4837]: I1014 13:11:20.367689 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" event={"ID":"a0b26320-e880-47dc-8ead-5b4547870db1","Type":"ContainerStarted","Data":"e550280884501818fa178717e02187fd35991c1d881265151766e593cbb833c1"} Oct 14 13:11:20 crc kubenswrapper[4837]: I1014 13:11:20.389928 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-qmtct" podStartSLOduration=1.92556281 podStartE2EDuration="4.389905384s" podCreationTimestamp="2025-10-14 13:11:16 +0000 UTC" 
firstStartedPulling="2025-10-14 13:11:17.014507098 +0000 UTC m=+614.931506921" lastFinishedPulling="2025-10-14 13:11:19.478849682 +0000 UTC m=+617.395849495" observedRunningTime="2025-10-14 13:11:20.387470348 +0000 UTC m=+618.304470211" watchObservedRunningTime="2025-10-14 13:11:20.389905384 +0000 UTC m=+618.306905237" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.523624 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.524569 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.527513 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gd97c" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.536368 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.557418 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.558205 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.562900 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.576907 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.589367 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zpg5h"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.590218 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zpg5h" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.675967 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.676672 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.681176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-94gbs" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.681272 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.682149 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689678 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-ovs-socket\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhrhs\" (UniqueName: \"kubernetes.io/projected/2e3f42bf-7e0b-4969-8b2e-0479072f35a4-kube-api-access-rhrhs\") pod \"nmstate-webhook-6cdbc54649-dktjp\" (UID: \"2e3f42bf-7e0b-4969-8b2e-0479072f35a4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689757 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wtx\" (UniqueName: \"kubernetes.io/projected/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-kube-api-access-b8wtx\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689779 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-nmstate-lock\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689796 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-dbus-socket\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689824 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2e3f42bf-7e0b-4969-8b2e-0479072f35a4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dktjp\" (UID: \"2e3f42bf-7e0b-4969-8b2e-0479072f35a4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.689977 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8wbl\" (UniqueName: \"kubernetes.io/projected/fe630318-04d6-4ba7-98d4-004f61f9e801-kube-api-access-z8wbl\") pod \"nmstate-metrics-fdff9cb8d-2fxhs\" (UID: \"fe630318-04d6-4ba7-98d4-004f61f9e801\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs" Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.716195 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"] Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.791717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e30fca9-8930-4438-baeb-6cd8437d808e-plugin-serving-cert\") pod 
\"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.791782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2e3f42bf-7e0b-4969-8b2e-0479072f35a4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dktjp\" (UID: \"2e3f42bf-7e0b-4969-8b2e-0479072f35a4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.791840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8wbl\" (UniqueName: \"kubernetes.io/projected/fe630318-04d6-4ba7-98d4-004f61f9e801-kube-api-access-z8wbl\") pod \"nmstate-metrics-fdff9cb8d-2fxhs\" (UID: \"fe630318-04d6-4ba7-98d4-004f61f9e801\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-ovs-socket\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792267 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e30fca9-8930-4438-baeb-6cd8437d808e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhrhs\" (UniqueName: \"kubernetes.io/projected/2e3f42bf-7e0b-4969-8b2e-0479072f35a4-kube-api-access-rhrhs\") pod \"nmstate-webhook-6cdbc54649-dktjp\" (UID: \"2e3f42bf-7e0b-4969-8b2e-0479072f35a4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wtx\" (UniqueName: \"kubernetes.io/projected/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-kube-api-access-b8wtx\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792350 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb8vf\" (UniqueName: \"kubernetes.io/projected/8e30fca9-8930-4438-baeb-6cd8437d808e-kube-api-access-jb8vf\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-nmstate-lock\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792397 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-dbus-socket\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-dbus-socket\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-nmstate-lock\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.792987 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-ovs-socket\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.798396 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2e3f42bf-7e0b-4969-8b2e-0479072f35a4-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dktjp\" (UID: \"2e3f42bf-7e0b-4969-8b2e-0479072f35a4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.826843 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8wbl\" (UniqueName: \"kubernetes.io/projected/fe630318-04d6-4ba7-98d4-004f61f9e801-kube-api-access-z8wbl\") pod \"nmstate-metrics-fdff9cb8d-2fxhs\" (UID: \"fe630318-04d6-4ba7-98d4-004f61f9e801\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.834584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wtx\" (UniqueName: \"kubernetes.io/projected/f596383f-8fd9-42cc-9554-8cfac0f1cbeb-kube-api-access-b8wtx\") pod \"nmstate-handler-zpg5h\" (UID: \"f596383f-8fd9-42cc-9554-8cfac0f1cbeb\") " pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.840907 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhrhs\" (UniqueName: \"kubernetes.io/projected/2e3f42bf-7e0b-4969-8b2e-0479072f35a4-kube-api-access-rhrhs\") pod \"nmstate-webhook-6cdbc54649-dktjp\" (UID: \"2e3f42bf-7e0b-4969-8b2e-0479072f35a4\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.848643 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.871938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.895060 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e30fca9-8930-4438-baeb-6cd8437d808e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.895676 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb8vf\" (UniqueName: \"kubernetes.io/projected/8e30fca9-8930-4438-baeb-6cd8437d808e-kube-api-access-jb8vf\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.895794 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e30fca9-8930-4438-baeb-6cd8437d808e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: E1014 13:11:21.895970 4837 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Oct 14 13:11:21 crc kubenswrapper[4837]: E1014 13:11:21.896054 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e30fca9-8930-4438-baeb-6cd8437d808e-plugin-serving-cert podName:8e30fca9-8930-4438-baeb-6cd8437d808e nodeName:}" failed. No retries permitted until 2025-10-14 13:11:22.396031363 +0000 UTC m=+620.313031166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8e30fca9-8930-4438-baeb-6cd8437d808e-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-5jz9k" (UID: "8e30fca9-8930-4438-baeb-6cd8437d808e") : secret "plugin-serving-cert" not found
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.896347 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e30fca9-8930-4438-baeb-6cd8437d808e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.905600 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.919780 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bbcd5d5b-7zx4g"]
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.920914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.924514 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bbcd5d5b-7zx4g"]
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.927454 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb8vf\" (UniqueName: \"kubernetes.io/projected/8e30fca9-8930-4438-baeb-6cd8437d808e-kube-api-access-jb8vf\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:21 crc kubenswrapper[4837]: W1014 13:11:21.957305 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf596383f_8fd9_42cc_9554_8cfac0f1cbeb.slice/crio-4e8bd122b0134a12e77af8a9f65fc83a7680aba5f1c43992e5af659a7e6b8b6e WatchSource:0}: Error finding container 4e8bd122b0134a12e77af8a9f65fc83a7680aba5f1c43992e5af659a7e6b8b6e: Status 404 returned error can't find the container with id 4e8bd122b0134a12e77af8a9f65fc83a7680aba5f1c43992e5af659a7e6b8b6e
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997759 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-oauth-serving-cert\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997792 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5hrb\" (UniqueName: \"kubernetes.io/projected/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-kube-api-access-k5hrb\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-oauth-config\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-config\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997876 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-service-ca\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997890 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-trusted-ca-bundle\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:21 crc kubenswrapper[4837]: I1014 13:11:21.997907 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-serving-cert\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.098854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-oauth-serving-cert\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.098928 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5hrb\" (UniqueName: \"kubernetes.io/projected/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-kube-api-access-k5hrb\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.099030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-oauth-config\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.099064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-config\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.099105 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-service-ca\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.099142 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-trusted-ca-bundle\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.099214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-serving-cert\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.100060 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-oauth-serving-cert\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.100134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-service-ca\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.100190 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-config\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.101482 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-trusted-ca-bundle\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.104651 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-oauth-config\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.104934 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-console-serving-cert\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.116647 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5hrb\" (UniqueName: \"kubernetes.io/projected/e4ad3be1-0af4-47d6-99c8-3e350a9e9198-kube-api-access-k5hrb\") pod \"console-bbcd5d5b-7zx4g\" (UID: \"e4ad3be1-0af4-47d6-99c8-3e350a9e9198\") " pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.241267 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.322211 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs"]
Oct 14 13:11:22 crc kubenswrapper[4837]: W1014 13:11:22.331548 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe630318_04d6_4ba7_98d4_004f61f9e801.slice/crio-aa678a5354d0915838f9c69b50223cb5bc2f1e107988c5025321b6eb4ea3f8de WatchSource:0}: Error finding container aa678a5354d0915838f9c69b50223cb5bc2f1e107988c5025321b6eb4ea3f8de: Status 404 returned error can't find the container with id aa678a5354d0915838f9c69b50223cb5bc2f1e107988c5025321b6eb4ea3f8de
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.332882 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"]
Oct 14 13:11:22 crc kubenswrapper[4837]: W1014 13:11:22.337002 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e3f42bf_7e0b_4969_8b2e_0479072f35a4.slice/crio-eec41640ba6449f2e383b8641e3a4091354a964d5a6b2208a6e7a5f74dab5ae0 WatchSource:0}: Error finding container eec41640ba6449f2e383b8641e3a4091354a964d5a6b2208a6e7a5f74dab5ae0: Status 404 returned error can't find the container with id eec41640ba6449f2e383b8641e3a4091354a964d5a6b2208a6e7a5f74dab5ae0
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.390723 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs" event={"ID":"fe630318-04d6-4ba7-98d4-004f61f9e801","Type":"ContainerStarted","Data":"aa678a5354d0915838f9c69b50223cb5bc2f1e107988c5025321b6eb4ea3f8de"}
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.392553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zpg5h" event={"ID":"f596383f-8fd9-42cc-9554-8cfac0f1cbeb","Type":"ContainerStarted","Data":"4e8bd122b0134a12e77af8a9f65fc83a7680aba5f1c43992e5af659a7e6b8b6e"}
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.394102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp" event={"ID":"2e3f42bf-7e0b-4969-8b2e-0479072f35a4","Type":"ContainerStarted","Data":"eec41640ba6449f2e383b8641e3a4091354a964d5a6b2208a6e7a5f74dab5ae0"}
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.402550 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e30fca9-8930-4438-baeb-6cd8437d808e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.408126 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e30fca9-8930-4438-baeb-6cd8437d808e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-5jz9k\" (UID: \"8e30fca9-8930-4438-baeb-6cd8437d808e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.478065 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bbcd5d5b-7zx4g"]
Oct 14 13:11:22 crc kubenswrapper[4837]: W1014 13:11:22.488143 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ad3be1_0af4_47d6_99c8_3e350a9e9198.slice/crio-751fd0155b3e35f4221f12d80158324622c9fadb87c37667ec135367f2c2c909 WatchSource:0}: Error finding container 751fd0155b3e35f4221f12d80158324622c9fadb87c37667ec135367f2c2c909: Status 404 returned error can't find the container with id 751fd0155b3e35f4221f12d80158324622c9fadb87c37667ec135367f2c2c909
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.589102 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"
Oct 14 13:11:22 crc kubenswrapper[4837]: I1014 13:11:22.856506 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k"]
Oct 14 13:11:22 crc kubenswrapper[4837]: W1014 13:11:22.865087 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e30fca9_8930_4438_baeb_6cd8437d808e.slice/crio-7ec3d297c332f8e91c645ead666fceb64c96e069ea723ea28c4a5b5bf4a14f9f WatchSource:0}: Error finding container 7ec3d297c332f8e91c645ead666fceb64c96e069ea723ea28c4a5b5bf4a14f9f: Status 404 returned error can't find the container with id 7ec3d297c332f8e91c645ead666fceb64c96e069ea723ea28c4a5b5bf4a14f9f
Oct 14 13:11:23 crc kubenswrapper[4837]: I1014 13:11:23.404682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k" event={"ID":"8e30fca9-8930-4438-baeb-6cd8437d808e","Type":"ContainerStarted","Data":"7ec3d297c332f8e91c645ead666fceb64c96e069ea723ea28c4a5b5bf4a14f9f"}
Oct 14 13:11:23 crc kubenswrapper[4837]: I1014 13:11:23.407417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bbcd5d5b-7zx4g" event={"ID":"e4ad3be1-0af4-47d6-99c8-3e350a9e9198","Type":"ContainerStarted","Data":"bd8edc43fe0dbe93e8f6997cb7844338ca295e17e05e9d45fd86249669a1e76c"}
Oct 14 13:11:23 crc kubenswrapper[4837]: I1014 13:11:23.407477 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bbcd5d5b-7zx4g" event={"ID":"e4ad3be1-0af4-47d6-99c8-3e350a9e9198","Type":"ContainerStarted","Data":"751fd0155b3e35f4221f12d80158324622c9fadb87c37667ec135367f2c2c909"}
Oct 14 13:11:23 crc kubenswrapper[4837]: I1014 13:11:23.432517 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bbcd5d5b-7zx4g" podStartSLOduration=2.432492438 podStartE2EDuration="2.432492438s" podCreationTimestamp="2025-10-14 13:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:11:23.422022823 +0000 UTC m=+621.339022676" watchObservedRunningTime="2025-10-14 13:11:23.432492438 +0000 UTC m=+621.349492281"
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.425555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zpg5h" event={"ID":"f596383f-8fd9-42cc-9554-8cfac0f1cbeb","Type":"ContainerStarted","Data":"109841c929006246b2d9df4341f920a1b86dfcf19d3297ec7d94386efea48ad5"}
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.426107 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.430656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k" event={"ID":"8e30fca9-8930-4438-baeb-6cd8437d808e","Type":"ContainerStarted","Data":"37dafb7a3fb2d62b45bdc99c4a60fbd39a5c1ff569732d1416ab355cc71370a7"}
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.433504 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp" event={"ID":"2e3f42bf-7e0b-4969-8b2e-0479072f35a4","Type":"ContainerStarted","Data":"3f124249c77b8ad2aa4bae07bc706ad746aac3c7f98808d7e0a86ac7c553a3cc"}
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.433877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.436391 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs" event={"ID":"fe630318-04d6-4ba7-98d4-004f61f9e801","Type":"ContainerStarted","Data":"e5edb60eeb6a2a30043a4ce54d05e5bbb4d8afb83d3ebf1cbe9c210a0ae3751f"}
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.470777 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zpg5h" podStartSLOduration=1.822270528 podStartE2EDuration="5.470757234s" podCreationTimestamp="2025-10-14 13:11:21 +0000 UTC" firstStartedPulling="2025-10-14 13:11:21.963381924 +0000 UTC m=+619.880381737" lastFinishedPulling="2025-10-14 13:11:25.61186863 +0000 UTC m=+623.528868443" observedRunningTime="2025-10-14 13:11:26.445107937 +0000 UTC m=+624.362107770" watchObservedRunningTime="2025-10-14 13:11:26.470757234 +0000 UTC m=+624.387757047"
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.474038 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp" podStartSLOduration=2.250668829 podStartE2EDuration="5.474021424s" podCreationTimestamp="2025-10-14 13:11:21 +0000 UTC" firstStartedPulling="2025-10-14 13:11:22.345233278 +0000 UTC m=+620.262233121" lastFinishedPulling="2025-10-14 13:11:25.568585873 +0000 UTC m=+623.485585716" observedRunningTime="2025-10-14 13:11:26.468470163 +0000 UTC m=+624.385469996" watchObservedRunningTime="2025-10-14 13:11:26.474021424 +0000 UTC m=+624.391021237"
Oct 14 13:11:26 crc kubenswrapper[4837]: I1014 13:11:26.498995 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-5jz9k" podStartSLOduration=2.803456596 podStartE2EDuration="5.497657745s" podCreationTimestamp="2025-10-14 13:11:21 +0000 UTC" firstStartedPulling="2025-10-14 13:11:22.866816429 +0000 UTC m=+620.783816232" lastFinishedPulling="2025-10-14 13:11:25.561017548 +0000 UTC m=+623.478017381" observedRunningTime="2025-10-14 13:11:26.494666784 +0000 UTC m=+624.411666607" watchObservedRunningTime="2025-10-14 13:11:26.497657745 +0000 UTC m=+624.414657568"
Oct 14 13:11:28 crc kubenswrapper[4837]: I1014 13:11:28.448549 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs" event={"ID":"fe630318-04d6-4ba7-98d4-004f61f9e801","Type":"ContainerStarted","Data":"819b2d7ee71e68afb145bd20d0d4f4d5ff180110c04658791da76416cab486cd"}
Oct 14 13:11:28 crc kubenswrapper[4837]: I1014 13:11:28.477457 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2fxhs" podStartSLOduration=1.9791236410000002 podStartE2EDuration="7.477429304s" podCreationTimestamp="2025-10-14 13:11:21 +0000 UTC" firstStartedPulling="2025-10-14 13:11:22.333824488 +0000 UTC m=+620.250824341" lastFinishedPulling="2025-10-14 13:11:27.832130191 +0000 UTC m=+625.749130004" observedRunningTime="2025-10-14 13:11:28.47656737 +0000 UTC m=+626.393567233" watchObservedRunningTime="2025-10-14 13:11:28.477429304 +0000 UTC m=+626.394429157"
Oct 14 13:11:31 crc kubenswrapper[4837]: I1014 13:11:31.944450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zpg5h"
Oct 14 13:11:32 crc kubenswrapper[4837]: I1014 13:11:32.241542 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:32 crc kubenswrapper[4837]: I1014 13:11:32.241912 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:32 crc kubenswrapper[4837]: I1014 13:11:32.250684 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:32 crc kubenswrapper[4837]: I1014 13:11:32.485483 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bbcd5d5b-7zx4g"
Oct 14 13:11:32 crc kubenswrapper[4837]: I1014 13:11:32.551611 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6vd5d"]
Oct 14 13:11:41 crc kubenswrapper[4837]: I1014 13:11:41.879907 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dktjp"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.187633 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"]
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.189508 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.191729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.200305 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"]
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.260240 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.260561 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2n7g\" (UniqueName: \"kubernetes.io/projected/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-kube-api-access-c2n7g\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.261309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.362458 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2n7g\" (UniqueName: \"kubernetes.io/projected/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-kube-api-access-c2n7g\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.362534 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.362576 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.363141 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.363385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.385069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2n7g\" (UniqueName: \"kubernetes.io/projected/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-kube-api-access-c2n7g\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:56 crc kubenswrapper[4837]: I1014 13:11:56.530278 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"
Oct 14 13:11:57 crc kubenswrapper[4837]: I1014 13:11:57.023059 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw"]
Oct 14 13:11:57 crc kubenswrapper[4837]: I1014 13:11:57.617006 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-6vd5d" podUID="fb47e83f-903a-4420-9741-645bbbdf63c4" containerName="console" containerID="cri-o://189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25" gracePeriod=15
Oct 14 13:11:57 crc kubenswrapper[4837]: I1014 13:11:57.666471 4837 generic.go:334] "Generic (PLEG): container finished" podID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerID="df24dc0be1bc12832068d676642438ff3b09078faaf0b99d741d78d6d676a0c8" exitCode=0
Oct 14 13:11:57 crc kubenswrapper[4837]: I1014 13:11:57.666526 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" event={"ID":"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7","Type":"ContainerDied","Data":"df24dc0be1bc12832068d676642438ff3b09078faaf0b99d741d78d6d676a0c8"}
Oct 14 13:11:57 crc kubenswrapper[4837]: I1014 13:11:57.666559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" event={"ID":"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7","Type":"ContainerStarted","Data":"1252524acf5627a90e055ae6bfc83bdfe835c6a3d52f68b7460c0267406d7041"}
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.003520 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6vd5d_fb47e83f-903a-4420-9741-645bbbdf63c4/console/0.log"
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.003589 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6vd5d"
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.097663 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-console-config\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") "
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.097819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-serving-cert\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") "
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.097921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz9rl\" (UniqueName: \"kubernetes.io/projected/fb47e83f-903a-4420-9741-645bbbdf63c4-kube-api-access-gz9rl\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") "
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.097959 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-service-ca\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") "
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.098017 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-trusted-ca-bundle\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") "
Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.098069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-oauth-serving-cert\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.098131 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-oauth-config\") pod \"fb47e83f-903a-4420-9741-645bbbdf63c4\" (UID: \"fb47e83f-903a-4420-9741-645bbbdf63c4\") " Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.098814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.098962 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-console-config" (OuterVolumeSpecName: "console-config") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.099181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.099270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.114347 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.114533 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb47e83f-903a-4420-9741-645bbbdf63c4-kube-api-access-gz9rl" (OuterVolumeSpecName: "kube-api-access-gz9rl") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "kube-api-access-gz9rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.114832 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fb47e83f-903a-4420-9741-645bbbdf63c4" (UID: "fb47e83f-903a-4420-9741-645bbbdf63c4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200129 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200218 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200239 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-console-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200295 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb47e83f-903a-4420-9741-645bbbdf63c4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200338 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz9rl\" (UniqueName: \"kubernetes.io/projected/fb47e83f-903a-4420-9741-645bbbdf63c4-kube-api-access-gz9rl\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200378 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-service-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.200402 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb47e83f-903a-4420-9741-645bbbdf63c4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:11:58 crc 
kubenswrapper[4837]: I1014 13:11:58.677428 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6vd5d_fb47e83f-903a-4420-9741-645bbbdf63c4/console/0.log" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.677517 4837 generic.go:334] "Generic (PLEG): container finished" podID="fb47e83f-903a-4420-9741-645bbbdf63c4" containerID="189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25" exitCode=2 Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.677571 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6vd5d" event={"ID":"fb47e83f-903a-4420-9741-645bbbdf63c4","Type":"ContainerDied","Data":"189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25"} Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.677623 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6vd5d" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.677731 4837 scope.go:117] "RemoveContainer" containerID="189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.677705 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6vd5d" event={"ID":"fb47e83f-903a-4420-9741-645bbbdf63c4","Type":"ContainerDied","Data":"5dbffabcf973cecbf3a2c02f5754462231b0db6182275f4a5f2123ade48d92eb"} Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.705766 4837 scope.go:117] "RemoveContainer" containerID="189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25" Oct 14 13:11:58 crc kubenswrapper[4837]: E1014 13:11:58.706429 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25\": container with ID starting with 189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25 not 
found: ID does not exist" containerID="189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.706483 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25"} err="failed to get container status \"189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25\": rpc error: code = NotFound desc = could not find container \"189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25\": container with ID starting with 189c0deff3760b651d75fea73d4303f6483272f910fd412e9d619f9fdd77fe25 not found: ID does not exist" Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.728828 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6vd5d"] Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.736244 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-6vd5d"] Oct 14 13:11:58 crc kubenswrapper[4837]: I1014 13:11:58.800786 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb47e83f-903a-4420-9741-645bbbdf63c4" path="/var/lib/kubelet/pods/fb47e83f-903a-4420-9741-645bbbdf63c4/volumes" Oct 14 13:11:59 crc kubenswrapper[4837]: I1014 13:11:59.691200 4837 generic.go:334] "Generic (PLEG): container finished" podID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerID="84687cc485c83265b9b87b263e8c3f71556c9937b2dc7bba1595b9fcb0319b55" exitCode=0 Oct 14 13:11:59 crc kubenswrapper[4837]: I1014 13:11:59.691271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" event={"ID":"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7","Type":"ContainerDied","Data":"84687cc485c83265b9b87b263e8c3f71556c9937b2dc7bba1595b9fcb0319b55"} Oct 14 13:12:00 crc kubenswrapper[4837]: I1014 13:12:00.700970 4837 generic.go:334] "Generic (PLEG): 
container finished" podID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerID="ddf45e989f96b4c7feb47be8b77b973ff87696461c92a58bd81980ace747cc85" exitCode=0 Oct 14 13:12:00 crc kubenswrapper[4837]: I1014 13:12:00.701010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" event={"ID":"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7","Type":"ContainerDied","Data":"ddf45e989f96b4c7feb47be8b77b973ff87696461c92a58bd81980ace747cc85"} Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.010594 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.048972 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-bundle\") pod \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.049040 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-util\") pod \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.049089 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2n7g\" (UniqueName: \"kubernetes.io/projected/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-kube-api-access-c2n7g\") pod \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\" (UID: \"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7\") " Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.050368 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-bundle" (OuterVolumeSpecName: "bundle") pod "f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" (UID: "f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.057767 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-kube-api-access-c2n7g" (OuterVolumeSpecName: "kube-api-access-c2n7g") pod "f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" (UID: "f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7"). InnerVolumeSpecName "kube-api-access-c2n7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.067871 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-util" (OuterVolumeSpecName: "util") pod "f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" (UID: "f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.150658 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.150706 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-util\") on node \"crc\" DevicePath \"\"" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.150724 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2n7g\" (UniqueName: \"kubernetes.io/projected/f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7-kube-api-access-c2n7g\") on node \"crc\" DevicePath \"\"" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.717744 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" event={"ID":"f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7","Type":"ContainerDied","Data":"1252524acf5627a90e055ae6bfc83bdfe835c6a3d52f68b7460c0267406d7041"} Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.717816 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1252524acf5627a90e055ae6bfc83bdfe835c6a3d52f68b7460c0267406d7041" Oct 14 13:12:02 crc kubenswrapper[4837]: I1014 13:12:02.717948 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.465186 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c"] Oct 14 13:12:11 crc kubenswrapper[4837]: E1014 13:12:11.465961 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="extract" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.465974 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="extract" Oct 14 13:12:11 crc kubenswrapper[4837]: E1014 13:12:11.465984 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="util" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.465990 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="util" Oct 14 13:12:11 crc kubenswrapper[4837]: E1014 13:12:11.465998 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="pull" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.466005 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="pull" Oct 14 13:12:11 crc kubenswrapper[4837]: E1014 13:12:11.466011 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb47e83f-903a-4420-9741-645bbbdf63c4" containerName="console" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.466017 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb47e83f-903a-4420-9741-645bbbdf63c4" containerName="console" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.466109 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb47e83f-903a-4420-9741-645bbbdf63c4" 
containerName="console" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.466136 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7" containerName="extract" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.466606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.468394 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.468632 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qgtbx" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.469138 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.469891 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.472325 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.478626 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c"] Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.590643 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c827589-1da4-40cd-967d-4144c014cee8-webhook-cert\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 
13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.590712 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c827589-1da4-40cd-967d-4144c014cee8-apiservice-cert\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.590755 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfwp\" (UniqueName: \"kubernetes.io/projected/0c827589-1da4-40cd-967d-4144c014cee8-kube-api-access-4kfwp\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.692109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c827589-1da4-40cd-967d-4144c014cee8-webhook-cert\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.692222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c827589-1da4-40cd-967d-4144c014cee8-apiservice-cert\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.692277 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfwp\" (UniqueName: 
\"kubernetes.io/projected/0c827589-1da4-40cd-967d-4144c014cee8-kube-api-access-4kfwp\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.698972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c827589-1da4-40cd-967d-4144c014cee8-webhook-cert\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.699104 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c827589-1da4-40cd-967d-4144c014cee8-apiservice-cert\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.701316 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2"] Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.702106 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.704522 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.704715 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.706038 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-ckmlg" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.720045 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2"] Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.726278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfwp\" (UniqueName: \"kubernetes.io/projected/0c827589-1da4-40cd-967d-4144c014cee8-kube-api-access-4kfwp\") pod \"metallb-operator-controller-manager-bb79b9dd7-l248c\" (UID: \"0c827589-1da4-40cd-967d-4144c014cee8\") " pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.781663 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.922789 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-apiservice-cert\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.922878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5nx\" (UniqueName: \"kubernetes.io/projected/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-kube-api-access-tn5nx\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:11 crc kubenswrapper[4837]: I1014 13:12:11.923009 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-webhook-cert\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.028821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-apiservice-cert\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.028899 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tn5nx\" (UniqueName: \"kubernetes.io/projected/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-kube-api-access-tn5nx\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.028961 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-webhook-cert\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.033925 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-apiservice-cert\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.039879 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-webhook-cert\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.052916 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn5nx\" (UniqueName: \"kubernetes.io/projected/58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5-kube-api-access-tn5nx\") pod \"metallb-operator-webhook-server-5788b958cf-vqdk2\" (UID: \"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5\") " 
pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.062427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.117908 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c"] Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.279150 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2"] Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.803310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" event={"ID":"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5","Type":"ContainerStarted","Data":"0adfff99e3fc2ca08c3a04e233472d5aeba3faacbdf412be2f195aebf1e48f31"} Oct 14 13:12:12 crc kubenswrapper[4837]: I1014 13:12:12.803360 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" event={"ID":"0c827589-1da4-40cd-967d-4144c014cee8","Type":"ContainerStarted","Data":"e4ca548e07f6ff936ed6defb58f22f3d3575fdacb7986d4a7a9a1de552450e26"} Oct 14 13:12:15 crc kubenswrapper[4837]: I1014 13:12:15.824529 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" event={"ID":"0c827589-1da4-40cd-967d-4144c014cee8","Type":"ContainerStarted","Data":"1347421bf52298ab11ba8a232cdf8faf435ef131c8cfc632bbec9a91ebd00699"} Oct 14 13:12:15 crc kubenswrapper[4837]: I1014 13:12:15.824935 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:15 crc kubenswrapper[4837]: I1014 13:12:15.847109 4837 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" podStartSLOduration=2.062504775 podStartE2EDuration="4.847090634s" podCreationTimestamp="2025-10-14 13:12:11 +0000 UTC" firstStartedPulling="2025-10-14 13:12:12.154808609 +0000 UTC m=+670.071808422" lastFinishedPulling="2025-10-14 13:12:14.939394468 +0000 UTC m=+672.856394281" observedRunningTime="2025-10-14 13:12:15.845516212 +0000 UTC m=+673.762516025" watchObservedRunningTime="2025-10-14 13:12:15.847090634 +0000 UTC m=+673.764090447" Oct 14 13:12:17 crc kubenswrapper[4837]: I1014 13:12:17.843333 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" event={"ID":"58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5","Type":"ContainerStarted","Data":"3828fc43ac8e1be40c70e0c140b056182aa3ba92cadd57289d0d3fa56c926607"} Oct 14 13:12:17 crc kubenswrapper[4837]: I1014 13:12:17.843760 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:17 crc kubenswrapper[4837]: I1014 13:12:17.867999 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" podStartSLOduration=2.151839395 podStartE2EDuration="6.867975631s" podCreationTimestamp="2025-10-14 13:12:11 +0000 UTC" firstStartedPulling="2025-10-14 13:12:12.286882112 +0000 UTC m=+670.203881925" lastFinishedPulling="2025-10-14 13:12:17.003018348 +0000 UTC m=+674.920018161" observedRunningTime="2025-10-14 13:12:17.864294101 +0000 UTC m=+675.781293944" watchObservedRunningTime="2025-10-14 13:12:17.867975631 +0000 UTC m=+675.784975444" Oct 14 13:12:32 crc kubenswrapper[4837]: I1014 13:12:32.071520 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5788b958cf-vqdk2" Oct 14 13:12:41 crc kubenswrapper[4837]: I1014 
13:12:41.140609 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:12:41 crc kubenswrapper[4837]: I1014 13:12:41.141039 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:12:51 crc kubenswrapper[4837]: I1014 13:12:51.783920 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-bb79b9dd7-l248c" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.574470 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cxg7q"] Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.578888 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.580363 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf"] Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.581592 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.586443 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.586517 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.586587 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q6xt4" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.587007 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.592575 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf"] Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.663757 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tlqsk"] Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.664807 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.666484 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.669670 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.669670 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.670770 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qblll" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.675595 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-9c8q6"] Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.676502 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.678344 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.691262 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-9c8q6"] Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.697979 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-frr-conf\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8vx\" (UniqueName: \"kubernetes.io/projected/529d2022-65d4-49b1-801d-f14d900cfdf7-kube-api-access-2n8vx\") pod \"frr-k8s-webhook-server-64bf5d555-hvtxf\" (UID: \"529d2022-65d4-49b1-801d-f14d900cfdf7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698286 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5598bd38-632b-4225-8064-3352c0dac0de-metrics-certs\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698377 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lv5\" (UniqueName: \"kubernetes.io/projected/5598bd38-632b-4225-8064-3352c0dac0de-kube-api-access-q6lv5\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " 
pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529d2022-65d4-49b1-801d-f14d900cfdf7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-hvtxf\" (UID: \"529d2022-65d4-49b1-801d-f14d900cfdf7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698532 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-reloader\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698633 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-frr-sockets\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698703 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-metrics\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.698773 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5598bd38-632b-4225-8064-3352c0dac0de-frr-startup\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 
13:12:52.800209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-frr-conf\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800282 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8vx\" (UniqueName: \"kubernetes.io/projected/529d2022-65d4-49b1-801d-f14d900cfdf7-kube-api-access-2n8vx\") pod \"frr-k8s-webhook-server-64bf5d555-hvtxf\" (UID: \"529d2022-65d4-49b1-801d-f14d900cfdf7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5598bd38-632b-4225-8064-3352c0dac0de-metrics-certs\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldj5\" (UniqueName: \"kubernetes.io/projected/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-kube-api-access-qldj5\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800417 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-q6lv5\" (UniqueName: \"kubernetes.io/projected/5598bd38-632b-4225-8064-3352c0dac0de-kube-api-access-q6lv5\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800442 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmnj\" (UniqueName: \"kubernetes.io/projected/f744e2d8-9bff-4348-8014-42a4a7a5cc20-kube-api-access-phmnj\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/529d2022-65d4-49b1-801d-f14d900cfdf7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-hvtxf\" (UID: \"529d2022-65d4-49b1-801d-f14d900cfdf7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800497 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f744e2d8-9bff-4348-8014-42a4a7a5cc20-metrics-certs\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-reloader\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-frr-sockets\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800572 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-metrics\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800590 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f744e2d8-9bff-4348-8014-42a4a7a5cc20-cert\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800612 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5598bd38-632b-4225-8064-3352c0dac0de-frr-startup\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800639 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-metrics-certs\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800670 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-metallb-excludel2\") pod \"speaker-tlqsk\" (UID: 
\"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.800839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-frr-conf\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.801478 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-reloader\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.801698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-frr-sockets\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.801940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5598bd38-632b-4225-8064-3352c0dac0de-metrics\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.802235 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5598bd38-632b-4225-8064-3352c0dac0de-frr-startup\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.806429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/529d2022-65d4-49b1-801d-f14d900cfdf7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-hvtxf\" (UID: \"529d2022-65d4-49b1-801d-f14d900cfdf7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.810596 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5598bd38-632b-4225-8064-3352c0dac0de-metrics-certs\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.818866 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8vx\" (UniqueName: \"kubernetes.io/projected/529d2022-65d4-49b1-801d-f14d900cfdf7-kube-api-access-2n8vx\") pod \"frr-k8s-webhook-server-64bf5d555-hvtxf\" (UID: \"529d2022-65d4-49b1-801d-f14d900cfdf7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.826046 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lv5\" (UniqueName: \"kubernetes.io/projected/5598bd38-632b-4225-8064-3352c0dac0de-kube-api-access-q6lv5\") pod \"frr-k8s-cxg7q\" (UID: \"5598bd38-632b-4225-8064-3352c0dac0de\") " pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901620 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmnj\" (UniqueName: \"kubernetes.io/projected/f744e2d8-9bff-4348-8014-42a4a7a5cc20-kube-api-access-phmnj\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901716 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f744e2d8-9bff-4348-8014-42a4a7a5cc20-metrics-certs\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f744e2d8-9bff-4348-8014-42a4a7a5cc20-cert\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901813 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-metrics-certs\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901850 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-metallb-excludel2\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.901979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldj5\" (UniqueName: \"kubernetes.io/projected/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-kube-api-access-qldj5\") pod \"speaker-tlqsk\" (UID: 
\"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: E1014 13:12:52.902465 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 14 13:12:52 crc kubenswrapper[4837]: E1014 13:12:52.902533 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist podName:f5bb08ae-810b-4b13-a2aa-6ff68721a5a3 nodeName:}" failed. No retries permitted until 2025-10-14 13:12:53.402511017 +0000 UTC m=+711.319510840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist") pod "speaker-tlqsk" (UID: "f5bb08ae-810b-4b13-a2aa-6ff68721a5a3") : secret "metallb-memberlist" not found Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.903045 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-metallb-excludel2\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.905358 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f744e2d8-9bff-4348-8014-42a4a7a5cc20-cert\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.909679 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-metrics-certs\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 
13:12:52.909946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f744e2d8-9bff-4348-8014-42a4a7a5cc20-metrics-certs\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.912224 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.920184 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.921182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldj5\" (UniqueName: \"kubernetes.io/projected/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-kube-api-access-qldj5\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.930126 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmnj\" (UniqueName: \"kubernetes.io/projected/f744e2d8-9bff-4348-8014-42a4a7a5cc20-kube-api-access-phmnj\") pod \"controller-68d546b9d8-9c8q6\" (UID: \"f744e2d8-9bff-4348-8014-42a4a7a5cc20\") " pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:52 crc kubenswrapper[4837]: I1014 13:12:52.988238 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:53 crc kubenswrapper[4837]: I1014 13:12:53.131032 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf"] Oct 14 13:12:53 crc kubenswrapper[4837]: I1014 13:12:53.408364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:53 crc kubenswrapper[4837]: E1014 13:12:53.408963 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 14 13:12:53 crc kubenswrapper[4837]: E1014 13:12:53.409029 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist podName:f5bb08ae-810b-4b13-a2aa-6ff68721a5a3 nodeName:}" failed. No retries permitted until 2025-10-14 13:12:54.409009011 +0000 UTC m=+712.326008834 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist") pod "speaker-tlqsk" (UID: "f5bb08ae-810b-4b13-a2aa-6ff68721a5a3") : secret "metallb-memberlist" not found Oct 14 13:12:53 crc kubenswrapper[4837]: I1014 13:12:53.411145 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-9c8q6"] Oct 14 13:12:53 crc kubenswrapper[4837]: W1014 13:12:53.419337 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf744e2d8_9bff_4348_8014_42a4a7a5cc20.slice/crio-929d6591010ad30005015b5ba44c1bcb24990764e02045b62cc795fa501bf0ab WatchSource:0}: Error finding container 929d6591010ad30005015b5ba44c1bcb24990764e02045b62cc795fa501bf0ab: Status 404 returned error can't find the container with id 929d6591010ad30005015b5ba44c1bcb24990764e02045b62cc795fa501bf0ab Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.066728 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-9c8q6" event={"ID":"f744e2d8-9bff-4348-8014-42a4a7a5cc20","Type":"ContainerStarted","Data":"85d55854716283598274a2ff2d252753c3633fa858fc81dd7a6dbaee31afe991"} Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.066800 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-9c8q6" event={"ID":"f744e2d8-9bff-4348-8014-42a4a7a5cc20","Type":"ContainerStarted","Data":"9630a5cc69d5d9b2b296f1cfd18004d44d09e1a4c2c0ccc23b77be91c9842ee7"} Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.066814 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-9c8q6" event={"ID":"f744e2d8-9bff-4348-8014-42a4a7a5cc20","Type":"ContainerStarted","Data":"929d6591010ad30005015b5ba44c1bcb24990764e02045b62cc795fa501bf0ab"} Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.066959 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.067935 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"88a7fc7fab55cb45c33846bd39149438a8afa4baa56000975c7d4d866ee77f9b"} Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.069455 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" event={"ID":"529d2022-65d4-49b1-801d-f14d900cfdf7","Type":"ContainerStarted","Data":"c3b9d3551b81d9abf0f0e66147398c396887e81bcf1223863048c7211d0d94a7"} Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.092213 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-9c8q6" podStartSLOduration=2.09219094 podStartE2EDuration="2.09219094s" podCreationTimestamp="2025-10-14 13:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:54.091144442 +0000 UTC m=+712.008144265" watchObservedRunningTime="2025-10-14 13:12:54.09219094 +0000 UTC m=+712.009190763" Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.423561 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.434109 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5bb08ae-810b-4b13-a2aa-6ff68721a5a3-memberlist\") pod \"speaker-tlqsk\" (UID: \"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3\") " pod="metallb-system/speaker-tlqsk" 
Oct 14 13:12:54 crc kubenswrapper[4837]: I1014 13:12:54.479544 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tlqsk" Oct 14 13:12:54 crc kubenswrapper[4837]: W1014 13:12:54.503654 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5bb08ae_810b_4b13_a2aa_6ff68721a5a3.slice/crio-fcf2dd8b07d6d4cfc1f6d3a21eb23e8eeb80eb63a5f70b0e71d1caca168a8177 WatchSource:0}: Error finding container fcf2dd8b07d6d4cfc1f6d3a21eb23e8eeb80eb63a5f70b0e71d1caca168a8177: Status 404 returned error can't find the container with id fcf2dd8b07d6d4cfc1f6d3a21eb23e8eeb80eb63a5f70b0e71d1caca168a8177 Oct 14 13:12:55 crc kubenswrapper[4837]: I1014 13:12:55.086712 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tlqsk" event={"ID":"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3","Type":"ContainerStarted","Data":"30c8780e39f176592c1b780190393f571d18f9fbed6891c6ef33e05c5fe84d94"} Oct 14 13:12:55 crc kubenswrapper[4837]: I1014 13:12:55.086754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tlqsk" event={"ID":"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3","Type":"ContainerStarted","Data":"6d70fcaaa5fbddf9593eeea0a2434ca976c355662598f5f862971ad5d7b10cf9"} Oct 14 13:12:55 crc kubenswrapper[4837]: I1014 13:12:55.086763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tlqsk" event={"ID":"f5bb08ae-810b-4b13-a2aa-6ff68721a5a3","Type":"ContainerStarted","Data":"fcf2dd8b07d6d4cfc1f6d3a21eb23e8eeb80eb63a5f70b0e71d1caca168a8177"} Oct 14 13:12:55 crc kubenswrapper[4837]: I1014 13:12:55.086997 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tlqsk" Oct 14 13:12:55 crc kubenswrapper[4837]: I1014 13:12:55.121633 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tlqsk" podStartSLOduration=3.121614181 
podStartE2EDuration="3.121614181s" podCreationTimestamp="2025-10-14 13:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:55.118802135 +0000 UTC m=+713.035801958" watchObservedRunningTime="2025-10-14 13:12:55.121614181 +0000 UTC m=+713.038614004" Oct 14 13:13:01 crc kubenswrapper[4837]: I1014 13:13:01.187634 4837 generic.go:334] "Generic (PLEG): container finished" podID="5598bd38-632b-4225-8064-3352c0dac0de" containerID="c29443233132752d4b2b056bbd7f7d70b435edff4715b75c3fedd3d334798bb0" exitCode=0 Oct 14 13:13:01 crc kubenswrapper[4837]: I1014 13:13:01.187734 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerDied","Data":"c29443233132752d4b2b056bbd7f7d70b435edff4715b75c3fedd3d334798bb0"} Oct 14 13:13:01 crc kubenswrapper[4837]: I1014 13:13:01.191607 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" event={"ID":"529d2022-65d4-49b1-801d-f14d900cfdf7","Type":"ContainerStarted","Data":"880e32e1e67d30ca7a5b6eaadc2eac48098898cf311cde9b9fca4926a1310832"} Oct 14 13:13:01 crc kubenswrapper[4837]: I1014 13:13:01.191841 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:13:01 crc kubenswrapper[4837]: I1014 13:13:01.265385 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" podStartSLOduration=2.1570666689999998 podStartE2EDuration="9.265362374s" podCreationTimestamp="2025-10-14 13:12:52 +0000 UTC" firstStartedPulling="2025-10-14 13:12:53.151069272 +0000 UTC m=+711.068069075" lastFinishedPulling="2025-10-14 13:13:00.259364957 +0000 UTC m=+718.176364780" observedRunningTime="2025-10-14 13:13:01.263953025 +0000 UTC m=+719.180952878" 
watchObservedRunningTime="2025-10-14 13:13:01.265362374 +0000 UTC m=+719.182362197" Oct 14 13:13:02 crc kubenswrapper[4837]: I1014 13:13:02.201538 4837 generic.go:334] "Generic (PLEG): container finished" podID="5598bd38-632b-4225-8064-3352c0dac0de" containerID="1cbdb83f13e2d1489f1e21b4aad1b0067fa56e2538173158a6d587d892afd9db" exitCode=0 Oct 14 13:13:02 crc kubenswrapper[4837]: I1014 13:13:02.201751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerDied","Data":"1cbdb83f13e2d1489f1e21b4aad1b0067fa56e2538173158a6d587d892afd9db"} Oct 14 13:13:03 crc kubenswrapper[4837]: I1014 13:13:03.208772 4837 generic.go:334] "Generic (PLEG): container finished" podID="5598bd38-632b-4225-8064-3352c0dac0de" containerID="5c43b15b42c174019ca33fdc0b24e5896aa051f5e57b70da2093f49dc7b795b3" exitCode=0 Oct 14 13:13:03 crc kubenswrapper[4837]: I1014 13:13:03.208838 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerDied","Data":"5c43b15b42c174019ca33fdc0b24e5896aa051f5e57b70da2093f49dc7b795b3"} Oct 14 13:13:04 crc kubenswrapper[4837]: I1014 13:13:04.226601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"38447f3ddf30f6f9305aab2a38a4d6b282d5caa646529f8be542e7af6bcd6b0a"} Oct 14 13:13:04 crc kubenswrapper[4837]: I1014 13:13:04.226860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"2bc6b34918c859903eeccb7be3b7aad53f8b474cb3a49148a28cf136abbcb874"} Oct 14 13:13:04 crc kubenswrapper[4837]: I1014 13:13:04.226869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" 
event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"d086ad2058511865280c4286b5273dc566a83ee6a8bed5e44e2bd03e0c4c13af"} Oct 14 13:13:04 crc kubenswrapper[4837]: I1014 13:13:04.226880 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"e0873dacbbbc46b7dbf9be6cd0d554aea66ca0de57451816e2061f0f709b405b"} Oct 14 13:13:04 crc kubenswrapper[4837]: I1014 13:13:04.226888 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"f7ca1fbce39ba1bc93b7993942cfed49524d72a84475c94a194d6444b9cbc33c"} Oct 14 13:13:04 crc kubenswrapper[4837]: I1014 13:13:04.483215 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tlqsk" Oct 14 13:13:05 crc kubenswrapper[4837]: I1014 13:13:05.245601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cxg7q" event={"ID":"5598bd38-632b-4225-8064-3352c0dac0de","Type":"ContainerStarted","Data":"e2d57421202c2b62f7ee9e23ed959367c8c958b82803759eb37fe8d48289ef37"} Oct 14 13:13:05 crc kubenswrapper[4837]: I1014 13:13:05.245863 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:13:05 crc kubenswrapper[4837]: I1014 13:13:05.275877 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cxg7q" podStartSLOduration=6.059643343 podStartE2EDuration="13.275852839s" podCreationTimestamp="2025-10-14 13:12:52 +0000 UTC" firstStartedPulling="2025-10-14 13:12:53.06536202 +0000 UTC m=+710.982361833" lastFinishedPulling="2025-10-14 13:13:00.281571376 +0000 UTC m=+718.198571329" observedRunningTime="2025-10-14 13:13:05.272396325 +0000 UTC m=+723.189396138" watchObservedRunningTime="2025-10-14 13:13:05.275852839 +0000 UTC 
m=+723.192852652" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.476742 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9cw9z"] Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.477909 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.480578 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-g2pzx" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.480622 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.491324 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.499736 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9cw9z"] Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.534829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lv2k\" (UniqueName: \"kubernetes.io/projected/97d71aa8-0165-4b1e-a791-3a975d19dd5f-kube-api-access-2lv2k\") pod \"openstack-operator-index-9cw9z\" (UID: \"97d71aa8-0165-4b1e-a791-3a975d19dd5f\") " pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.636723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lv2k\" (UniqueName: \"kubernetes.io/projected/97d71aa8-0165-4b1e-a791-3a975d19dd5f-kube-api-access-2lv2k\") pod \"openstack-operator-index-9cw9z\" (UID: \"97d71aa8-0165-4b1e-a791-3a975d19dd5f\") " pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:07 crc 
kubenswrapper[4837]: I1014 13:13:07.658032 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lv2k\" (UniqueName: \"kubernetes.io/projected/97d71aa8-0165-4b1e-a791-3a975d19dd5f-kube-api-access-2lv2k\") pod \"openstack-operator-index-9cw9z\" (UID: \"97d71aa8-0165-4b1e-a791-3a975d19dd5f\") " pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.805473 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.912900 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:13:07 crc kubenswrapper[4837]: I1014 13:13:07.997122 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:13:08 crc kubenswrapper[4837]: I1014 13:13:08.088426 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9cw9z"] Oct 14 13:13:08 crc kubenswrapper[4837]: I1014 13:13:08.265784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9cw9z" event={"ID":"97d71aa8-0165-4b1e-a791-3a975d19dd5f","Type":"ContainerStarted","Data":"d2564aebea3ce7d404fa0cdce16c407b40daf950cda37ac4a263faf6b885119b"} Oct 14 13:13:10 crc kubenswrapper[4837]: I1014 13:13:10.843457 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9cw9z"] Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.139852 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:13:11 crc 
kubenswrapper[4837]: I1014 13:13:11.139963 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.457255 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qq6lf"] Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.458461 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.465238 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qq6lf"] Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.525101 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csmp8\" (UniqueName: \"kubernetes.io/projected/aafb3bab-e32a-4523-8b72-b3131408a0be-kube-api-access-csmp8\") pod \"openstack-operator-index-qq6lf\" (UID: \"aafb3bab-e32a-4523-8b72-b3131408a0be\") " pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.626192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csmp8\" (UniqueName: \"kubernetes.io/projected/aafb3bab-e32a-4523-8b72-b3131408a0be-kube-api-access-csmp8\") pod \"openstack-operator-index-qq6lf\" (UID: \"aafb3bab-e32a-4523-8b72-b3131408a0be\") " pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.645707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csmp8\" (UniqueName: 
\"kubernetes.io/projected/aafb3bab-e32a-4523-8b72-b3131408a0be-kube-api-access-csmp8\") pod \"openstack-operator-index-qq6lf\" (UID: \"aafb3bab-e32a-4523-8b72-b3131408a0be\") " pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:11 crc kubenswrapper[4837]: I1014 13:13:11.795896 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:12 crc kubenswrapper[4837]: I1014 13:13:12.927591 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-hvtxf" Oct 14 13:13:12 crc kubenswrapper[4837]: I1014 13:13:12.992369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-9c8q6" Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.116141 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qq6lf"] Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.300301 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qq6lf" event={"ID":"aafb3bab-e32a-4523-8b72-b3131408a0be","Type":"ContainerStarted","Data":"fc65e36c57b697b540f2f982c45bd793e7b1f7c77e43a4909e74ef900e9c356c"} Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.302237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9cw9z" event={"ID":"97d71aa8-0165-4b1e-a791-3a975d19dd5f","Type":"ContainerStarted","Data":"f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23"} Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.302478 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9cw9z" podUID="97d71aa8-0165-4b1e-a791-3a975d19dd5f" containerName="registry-server" 
containerID="cri-o://f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23" gracePeriod=2 Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.333093 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9cw9z" podStartSLOduration=1.507996069 podStartE2EDuration="6.333061774s" podCreationTimestamp="2025-10-14 13:13:07 +0000 UTC" firstStartedPulling="2025-10-14 13:13:08.097785428 +0000 UTC m=+726.014785241" lastFinishedPulling="2025-10-14 13:13:12.922851133 +0000 UTC m=+730.839850946" observedRunningTime="2025-10-14 13:13:13.329442725 +0000 UTC m=+731.246442608" watchObservedRunningTime="2025-10-14 13:13:13.333061774 +0000 UTC m=+731.250061597" Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.749003 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.862570 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lv2k\" (UniqueName: \"kubernetes.io/projected/97d71aa8-0165-4b1e-a791-3a975d19dd5f-kube-api-access-2lv2k\") pod \"97d71aa8-0165-4b1e-a791-3a975d19dd5f\" (UID: \"97d71aa8-0165-4b1e-a791-3a975d19dd5f\") " Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.870945 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d71aa8-0165-4b1e-a791-3a975d19dd5f-kube-api-access-2lv2k" (OuterVolumeSpecName: "kube-api-access-2lv2k") pod "97d71aa8-0165-4b1e-a791-3a975d19dd5f" (UID: "97d71aa8-0165-4b1e-a791-3a975d19dd5f"). InnerVolumeSpecName "kube-api-access-2lv2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:13:13 crc kubenswrapper[4837]: I1014 13:13:13.964261 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lv2k\" (UniqueName: \"kubernetes.io/projected/97d71aa8-0165-4b1e-a791-3a975d19dd5f-kube-api-access-2lv2k\") on node \"crc\" DevicePath \"\"" Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.313761 4837 generic.go:334] "Generic (PLEG): container finished" podID="97d71aa8-0165-4b1e-a791-3a975d19dd5f" containerID="f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23" exitCode=0 Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.313812 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9cw9z" Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.313819 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9cw9z" event={"ID":"97d71aa8-0165-4b1e-a791-3a975d19dd5f","Type":"ContainerDied","Data":"f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23"} Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.313896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9cw9z" event={"ID":"97d71aa8-0165-4b1e-a791-3a975d19dd5f","Type":"ContainerDied","Data":"d2564aebea3ce7d404fa0cdce16c407b40daf950cda37ac4a263faf6b885119b"} Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.313936 4837 scope.go:117] "RemoveContainer" containerID="f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23" Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.317361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qq6lf" event={"ID":"aafb3bab-e32a-4523-8b72-b3131408a0be","Type":"ContainerStarted","Data":"82980e16e3f66a70bbf918b0a31db66ee1786f66c7af1c68787d9ecd073b7473"} Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 
13:13:14.359188 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qq6lf" podStartSLOduration=3.189020571 podStartE2EDuration="3.359134012s" podCreationTimestamp="2025-10-14 13:13:11 +0000 UTC" firstStartedPulling="2025-10-14 13:13:13.137721212 +0000 UTC m=+731.054721025" lastFinishedPulling="2025-10-14 13:13:13.307834613 +0000 UTC m=+731.224834466" observedRunningTime="2025-10-14 13:13:14.346414859 +0000 UTC m=+732.263414712" watchObservedRunningTime="2025-10-14 13:13:14.359134012 +0000 UTC m=+732.276133835" Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.366543 4837 scope.go:117] "RemoveContainer" containerID="f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23" Oct 14 13:13:14 crc kubenswrapper[4837]: E1014 13:13:14.366968 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23\": container with ID starting with f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23 not found: ID does not exist" containerID="f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23" Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.367019 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23"} err="failed to get container status \"f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23\": rpc error: code = NotFound desc = could not find container \"f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23\": container with ID starting with f5ba99bea5f0d10dbc82c4ff498b6f5f83388b1b9eb8d97dc8e466a068a78a23 not found: ID does not exist" Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.400741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9cw9z"] 
Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.409104 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9cw9z"] Oct 14 13:13:14 crc kubenswrapper[4837]: I1014 13:13:14.799751 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d71aa8-0165-4b1e-a791-3a975d19dd5f" path="/var/lib/kubelet/pods/97d71aa8-0165-4b1e-a791-3a975d19dd5f/volumes" Oct 14 13:13:21 crc kubenswrapper[4837]: I1014 13:13:21.796091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:21 crc kubenswrapper[4837]: I1014 13:13:21.796753 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:21 crc kubenswrapper[4837]: I1014 13:13:21.828648 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:22 crc kubenswrapper[4837]: I1014 13:13:22.411883 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qq6lf" Oct 14 13:13:22 crc kubenswrapper[4837]: I1014 13:13:22.917344 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cxg7q" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.370622 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dfwxh"] Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.371969 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerName="controller-manager" containerID="cri-o://2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8" gracePeriod=30 Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 
13:13:27.471959 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"] Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.472237 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" podUID="a284d0f7-a004-45c1-9eb6-a500afacf05b" containerName="route-controller-manager" containerID="cri-o://65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32" gracePeriod=30 Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.780560 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.863982 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d4cs\" (UniqueName: \"kubernetes.io/projected/e6c42468-5fc7-4a67-86d7-73c0f7589899-kube-api-access-5d4cs\") pod \"e6c42468-5fc7-4a67-86d7-73c0f7589899\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.864052 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-client-ca\") pod \"e6c42468-5fc7-4a67-86d7-73c0f7589899\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.864110 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-config\") pod \"e6c42468-5fc7-4a67-86d7-73c0f7589899\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.864133 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-proxy-ca-bundles\") pod \"e6c42468-5fc7-4a67-86d7-73c0f7589899\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.864180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c42468-5fc7-4a67-86d7-73c0f7589899-serving-cert\") pod \"e6c42468-5fc7-4a67-86d7-73c0f7589899\" (UID: \"e6c42468-5fc7-4a67-86d7-73c0f7589899\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.865612 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6c42468-5fc7-4a67-86d7-73c0f7589899" (UID: "e6c42468-5fc7-4a67-86d7-73c0f7589899"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.866269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-config" (OuterVolumeSpecName: "config") pod "e6c42468-5fc7-4a67-86d7-73c0f7589899" (UID: "e6c42468-5fc7-4a67-86d7-73c0f7589899"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.866843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e6c42468-5fc7-4a67-86d7-73c0f7589899" (UID: "e6c42468-5fc7-4a67-86d7-73c0f7589899"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.875638 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c42468-5fc7-4a67-86d7-73c0f7589899-kube-api-access-5d4cs" (OuterVolumeSpecName: "kube-api-access-5d4cs") pod "e6c42468-5fc7-4a67-86d7-73c0f7589899" (UID: "e6c42468-5fc7-4a67-86d7-73c0f7589899"). InnerVolumeSpecName "kube-api-access-5d4cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.875801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c42468-5fc7-4a67-86d7-73c0f7589899-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6c42468-5fc7-4a67-86d7-73c0f7589899" (UID: "e6c42468-5fc7-4a67-86d7-73c0f7589899"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.893614 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966088 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-config\") pod \"a284d0f7-a004-45c1-9eb6-a500afacf05b\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966196 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-875bw\" (UniqueName: \"kubernetes.io/projected/a284d0f7-a004-45c1-9eb6-a500afacf05b-kube-api-access-875bw\") pod \"a284d0f7-a004-45c1-9eb6-a500afacf05b\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966247 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a284d0f7-a004-45c1-9eb6-a500afacf05b-serving-cert\") pod \"a284d0f7-a004-45c1-9eb6-a500afacf05b\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966306 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-client-ca\") pod \"a284d0f7-a004-45c1-9eb6-a500afacf05b\" (UID: \"a284d0f7-a004-45c1-9eb6-a500afacf05b\") " Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966553 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d4cs\" (UniqueName: \"kubernetes.io/projected/e6c42468-5fc7-4a67-86d7-73c0f7589899-kube-api-access-5d4cs\") on node \"crc\" DevicePath \"\"" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966575 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-client-ca\") on node \"crc\" DevicePath \"\"" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966588 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966600 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6c42468-5fc7-4a67-86d7-73c0f7589899-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.966612 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c42468-5fc7-4a67-86d7-73c0f7589899-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.967086 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a284d0f7-a004-45c1-9eb6-a500afacf05b" (UID: "a284d0f7-a004-45c1-9eb6-a500afacf05b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.967105 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-config" (OuterVolumeSpecName: "config") pod "a284d0f7-a004-45c1-9eb6-a500afacf05b" (UID: "a284d0f7-a004-45c1-9eb6-a500afacf05b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.969047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a284d0f7-a004-45c1-9eb6-a500afacf05b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a284d0f7-a004-45c1-9eb6-a500afacf05b" (UID: "a284d0f7-a004-45c1-9eb6-a500afacf05b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:13:27 crc kubenswrapper[4837]: I1014 13:13:27.969249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a284d0f7-a004-45c1-9eb6-a500afacf05b-kube-api-access-875bw" (OuterVolumeSpecName: "kube-api-access-875bw") pod "a284d0f7-a004-45c1-9eb6-a500afacf05b" (UID: "a284d0f7-a004-45c1-9eb6-a500afacf05b"). InnerVolumeSpecName "kube-api-access-875bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.068119 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a284d0f7-a004-45c1-9eb6-a500afacf05b-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.068150 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-client-ca\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.068173 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a284d0f7-a004-45c1-9eb6-a500afacf05b-config\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.068182 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-875bw\" (UniqueName: \"kubernetes.io/projected/a284d0f7-a004-45c1-9eb6-a500afacf05b-kube-api-access-875bw\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.129856 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76b79577f5-7cpzd"]
Oct 14 13:13:28 crc kubenswrapper[4837]: E1014 13:13:28.130085 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d71aa8-0165-4b1e-a791-3a975d19dd5f" containerName="registry-server"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130096 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d71aa8-0165-4b1e-a791-3a975d19dd5f" containerName="registry-server"
Oct 14 13:13:28 crc kubenswrapper[4837]: E1014 13:13:28.130112 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a284d0f7-a004-45c1-9eb6-a500afacf05b" containerName="route-controller-manager"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130119 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a284d0f7-a004-45c1-9eb6-a500afacf05b" containerName="route-controller-manager"
Oct 14 13:13:28 crc kubenswrapper[4837]: E1014 13:13:28.130134 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerName="controller-manager"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130145 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerName="controller-manager"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130277 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a284d0f7-a004-45c1-9eb6-a500afacf05b" containerName="route-controller-manager"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130287 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerName="controller-manager"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130298 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d71aa8-0165-4b1e-a791-3a975d19dd5f" containerName="registry-server"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.130707 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.141277 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76b79577f5-7cpzd"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.147633 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.148300 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.158393 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.169469 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f57a7706-92e7-4307-b6a0-694c90288780-serving-cert\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.169519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghfvk\" (UniqueName: \"kubernetes.io/projected/f57a7706-92e7-4307-b6a0-694c90288780-kube-api-access-ghfvk\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.169548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-proxy-ca-bundles\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.169587 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-client-ca\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.169623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-config\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270002 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f57a7706-92e7-4307-b6a0-694c90288780-serving-cert\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270039 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghfvk\" (UniqueName: \"kubernetes.io/projected/f57a7706-92e7-4307-b6a0-694c90288780-kube-api-access-ghfvk\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270080 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-proxy-ca-bundles\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270106 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-client-ca\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-config\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cd7121d-696f-4b85-91f6-0caecaff69cc-serving-cert\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd7121d-696f-4b85-91f6-0caecaff69cc-config\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270217 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cd7121d-696f-4b85-91f6-0caecaff69cc-client-ca\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.270232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wbm\" (UniqueName: \"kubernetes.io/projected/9cd7121d-696f-4b85-91f6-0caecaff69cc-kube-api-access-k2wbm\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.271021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-client-ca\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.271462 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-proxy-ca-bundles\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.271761 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57a7706-92e7-4307-b6a0-694c90288780-config\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.275116 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f57a7706-92e7-4307-b6a0-694c90288780-serving-cert\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.284809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghfvk\" (UniqueName: \"kubernetes.io/projected/f57a7706-92e7-4307-b6a0-694c90288780-kube-api-access-ghfvk\") pod \"controller-manager-76b79577f5-7cpzd\" (UID: \"f57a7706-92e7-4307-b6a0-694c90288780\") " pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.371139 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd7121d-696f-4b85-91f6-0caecaff69cc-config\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.371202 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cd7121d-696f-4b85-91f6-0caecaff69cc-serving-cert\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.371226 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cd7121d-696f-4b85-91f6-0caecaff69cc-client-ca\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.371247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wbm\" (UniqueName: \"kubernetes.io/projected/9cd7121d-696f-4b85-91f6-0caecaff69cc-kube-api-access-k2wbm\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.372635 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cd7121d-696f-4b85-91f6-0caecaff69cc-client-ca\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.372934 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd7121d-696f-4b85-91f6-0caecaff69cc-config\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.374727 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cd7121d-696f-4b85-91f6-0caecaff69cc-serving-cert\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.392132 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2wbm\" (UniqueName: \"kubernetes.io/projected/9cd7121d-696f-4b85-91f6-0caecaff69cc-kube-api-access-k2wbm\") pod \"route-controller-manager-7b85b49969-x7cmv\" (UID: \"9cd7121d-696f-4b85-91f6-0caecaff69cc\") " pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.412538 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6c42468-5fc7-4a67-86d7-73c0f7589899" containerID="2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8" exitCode=0
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.412603 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" event={"ID":"e6c42468-5fc7-4a67-86d7-73c0f7589899","Type":"ContainerDied","Data":"2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8"}
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.412631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh" event={"ID":"e6c42468-5fc7-4a67-86d7-73c0f7589899","Type":"ContainerDied","Data":"a204ba1b6f60676c587ce4b794a5e1d5f686c3ce7749a8c4c28ef15dc41aa9f0"}
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.412624 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dfwxh"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.412645 4837 scope.go:117] "RemoveContainer" containerID="2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.414220 4837 generic.go:334] "Generic (PLEG): container finished" podID="a284d0f7-a004-45c1-9eb6-a500afacf05b" containerID="65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32" exitCode=0
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.414294 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.414287 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" event={"ID":"a284d0f7-a004-45c1-9eb6-a500afacf05b","Type":"ContainerDied","Data":"65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32"}
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.414379 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw" event={"ID":"a284d0f7-a004-45c1-9eb6-a500afacf05b","Type":"ContainerDied","Data":"4312bca648e55e0748617a41fa432695e4a04605a623fc63ad0818301085b871"}
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.436797 4837 scope.go:117] "RemoveContainer" containerID="2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8"
Oct 14 13:13:28 crc kubenswrapper[4837]: E1014 13:13:28.437806 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8\": container with ID starting with 2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8 not found: ID does not exist" containerID="2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.437843 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8"} err="failed to get container status \"2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8\": rpc error: code = NotFound desc = could not find container \"2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8\": container with ID starting with 2ce0a3c4e88ee2ed87b344824188b32c776b22c013e6682c29d48454d8ab60d8 not found: ID does not exist"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.437869 4837 scope.go:117] "RemoveContainer" containerID="65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.446366 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.447572 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.455287 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vnpkw"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.461556 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dfwxh"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.463341 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.465763 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dfwxh"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.466968 4837 scope.go:117] "RemoveContainer" containerID="65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32"
Oct 14 13:13:28 crc kubenswrapper[4837]: E1014 13:13:28.467451 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32\": container with ID starting with 65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32 not found: ID does not exist" containerID="65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.467498 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32"} err="failed to get container status \"65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32\": rpc error: code = NotFound desc = could not find container \"65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32\": container with ID starting with 65055c8b27889b1ffafa95fdd1f707db53c5a031c192564c325712cd84db4b32 not found: ID does not exist"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.796287 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a284d0f7-a004-45c1-9eb6-a500afacf05b" path="/var/lib/kubelet/pods/a284d0f7-a004-45c1-9eb6-a500afacf05b/volumes"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.797449 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c42468-5fc7-4a67-86d7-73c0f7589899" path="/var/lib/kubelet/pods/e6c42468-5fc7-4a67-86d7-73c0f7589899/volumes"
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.945074 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76b79577f5-7cpzd"]
Oct 14 13:13:28 crc kubenswrapper[4837]: I1014 13:13:28.981147 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"]
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.421229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd" event={"ID":"f57a7706-92e7-4307-b6a0-694c90288780","Type":"ContainerStarted","Data":"a21b7cd531c1718462d54f34b00806f8d396a3b848ce516a40e9d1166c17bef3"}
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.421304 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.421322 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd" event={"ID":"f57a7706-92e7-4307-b6a0-694c90288780","Type":"ContainerStarted","Data":"1946d866834914c09b67f81a166fed8fb09073a995760eef2f9903b77014364c"}
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.423701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv" event={"ID":"9cd7121d-696f-4b85-91f6-0caecaff69cc","Type":"ContainerStarted","Data":"d4ea4bdfce3f744efddd7a324fd6ab173c553a58117ded27a054edbb4231cdcd"}
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.423743 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv" event={"ID":"9cd7121d-696f-4b85-91f6-0caecaff69cc","Type":"ContainerStarted","Data":"8c1b62fa170193e94633fb8ae5126437f699f5841261c17b5a2f5121c4ecb66f"}
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.424091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.436141 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.452975 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76b79577f5-7cpzd" podStartSLOduration=1.4529597349999999 podStartE2EDuration="1.452959735s" podCreationTimestamp="2025-10-14 13:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:29.450673233 +0000 UTC m=+747.367673056" watchObservedRunningTime="2025-10-14 13:13:29.452959735 +0000 UTC m=+747.369959548"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.477355 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv" podStartSLOduration=1.477336383 podStartE2EDuration="1.477336383s" podCreationTimestamp="2025-10-14 13:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:29.476710656 +0000 UTC m=+747.393710469" watchObservedRunningTime="2025-10-14 13:13:29.477336383 +0000 UTC m=+747.394336196"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.653082 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b85b49969-x7cmv"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.827582 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"]
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.828540 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.831701 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nvwzh"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.842816 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"]
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.890830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-bundle\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.890944 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzk7\" (UniqueName: \"kubernetes.io/projected/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-kube-api-access-4tzk7\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.891046 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-util\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.992649 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-bundle\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.992720 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzk7\" (UniqueName: \"kubernetes.io/projected/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-kube-api-access-4tzk7\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.992786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-util\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.993353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-util\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:29 crc kubenswrapper[4837]: I1014 13:13:29.993473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-bundle\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:30 crc kubenswrapper[4837]: I1014 13:13:30.011548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzk7\" (UniqueName: \"kubernetes.io/projected/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-kube-api-access-4tzk7\") pod \"2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") " pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:30 crc kubenswrapper[4837]: I1014 13:13:30.149673 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:30 crc kubenswrapper[4837]: I1014 13:13:30.564678 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"]
Oct 14 13:13:30 crc kubenswrapper[4837]: W1014 13:13:30.575044 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d53462f_684e_4f9b_91dc_c9b7e9edf8aa.slice/crio-a688247d447a6cb34317a044e495fcb10127f8a23d09a93158ea99c41a06f858 WatchSource:0}: Error finding container a688247d447a6cb34317a044e495fcb10127f8a23d09a93158ea99c41a06f858: Status 404 returned error can't find the container with id a688247d447a6cb34317a044e495fcb10127f8a23d09a93158ea99c41a06f858
Oct 14 13:13:31 crc kubenswrapper[4837]: I1014 13:13:31.442589 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerID="a8d31c9fbc0812fa4cce7facdaae9406ccdcfe1b8a44b78d67cde1407d055516" exitCode=0
Oct 14 13:13:31 crc kubenswrapper[4837]: I1014 13:13:31.442782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk" event={"ID":"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa","Type":"ContainerDied","Data":"a8d31c9fbc0812fa4cce7facdaae9406ccdcfe1b8a44b78d67cde1407d055516"}
Oct 14 13:13:31 crc kubenswrapper[4837]: I1014 13:13:31.443746 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk" event={"ID":"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa","Type":"ContainerStarted","Data":"a688247d447a6cb34317a044e495fcb10127f8a23d09a93158ea99c41a06f858"}
Oct 14 13:13:32 crc kubenswrapper[4837]: I1014 13:13:32.452111 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerID="95056830c22fec3b669ad635d577ce0bc8100d96a544552afb2a9e9e97af4f22" exitCode=0
Oct 14 13:13:32 crc kubenswrapper[4837]: I1014 13:13:32.452229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk" event={"ID":"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa","Type":"ContainerDied","Data":"95056830c22fec3b669ad635d577ce0bc8100d96a544552afb2a9e9e97af4f22"}
Oct 14 13:13:33 crc kubenswrapper[4837]: I1014 13:13:33.463562 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerID="982b81ad7c4c5c057258f580875e475722863a9b782837e93c0976ea9de9e842" exitCode=0
Oct 14 13:13:33 crc kubenswrapper[4837]: I1014 13:13:33.463637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk" event={"ID":"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa","Type":"ContainerDied","Data":"982b81ad7c4c5c057258f580875e475722863a9b782837e93c0976ea9de9e842"}
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.779461 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.871967 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-util\") pod \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") "
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.872140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tzk7\" (UniqueName: \"kubernetes.io/projected/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-kube-api-access-4tzk7\") pod \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") "
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.873325 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-bundle\") pod \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\" (UID: \"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa\") "
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.874488 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-bundle" (OuterVolumeSpecName: "bundle") pod "6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" (UID: "6d53462f-684e-4f9b-91dc-c9b7e9edf8aa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.878240 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-kube-api-access-4tzk7" (OuterVolumeSpecName: "kube-api-access-4tzk7") pod "6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" (UID: "6d53462f-684e-4f9b-91dc-c9b7e9edf8aa"). InnerVolumeSpecName "kube-api-access-4tzk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.901297 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-util" (OuterVolumeSpecName: "util") pod "6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" (UID: "6d53462f-684e-4f9b-91dc-c9b7e9edf8aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.975319 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tzk7\" (UniqueName: \"kubernetes.io/projected/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-kube-api-access-4tzk7\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.975348 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-bundle\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:34 crc kubenswrapper[4837]: I1014 13:13:34.975357 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d53462f-684e-4f9b-91dc-c9b7e9edf8aa-util\") on node \"crc\" DevicePath \"\""
Oct 14 13:13:35 crc kubenswrapper[4837]: I1014 13:13:35.477751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk" event={"ID":"6d53462f-684e-4f9b-91dc-c9b7e9edf8aa","Type":"ContainerDied","Data":"a688247d447a6cb34317a044e495fcb10127f8a23d09a93158ea99c41a06f858"}
Oct 14 13:13:35 crc kubenswrapper[4837]: I1014 13:13:35.477794 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a688247d447a6cb34317a044e495fcb10127f8a23d09a93158ea99c41a06f858"
Oct 14 13:13:35 crc kubenswrapper[4837]: I1014 13:13:35.477878 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk"
Oct 14 13:13:36 crc kubenswrapper[4837]: I1014 13:13:36.810866 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.140058 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.140455 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.140504 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd"
Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.141117 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9c5da248ef4f304e8c83104496af5297a77f5eb3df38f2188353642fbfdb087"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.141206 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon"
containerID="cri-o://a9c5da248ef4f304e8c83104496af5297a77f5eb3df38f2188353642fbfdb087" gracePeriod=600 Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.520262 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="a9c5da248ef4f304e8c83104496af5297a77f5eb3df38f2188353642fbfdb087" exitCode=0 Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.520303 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"a9c5da248ef4f304e8c83104496af5297a77f5eb3df38f2188353642fbfdb087"} Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.520614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"2f7061072f040d06169aa6c27c24b779e700a974c29cdb9d45439f3b10ea132d"} Oct 14 13:13:41 crc kubenswrapper[4837]: I1014 13:13:41.520639 4837 scope.go:117] "RemoveContainer" containerID="c9df650ea9a0889b5303a141ba1c69bbbdcc6bf28d1e1e51c58ad3e80e0c7622" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.418939 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95"] Oct 14 13:13:42 crc kubenswrapper[4837]: E1014 13:13:42.419497 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="util" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.419514 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="util" Oct 14 13:13:42 crc kubenswrapper[4837]: E1014 13:13:42.419525 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="pull" Oct 14 13:13:42 crc 
kubenswrapper[4837]: I1014 13:13:42.419534 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="pull" Oct 14 13:13:42 crc kubenswrapper[4837]: E1014 13:13:42.419545 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="extract" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.419553 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="extract" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.419666 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d53462f-684e-4f9b-91dc-c9b7e9edf8aa" containerName="extract" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.420329 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.424401 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-x6ddv" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.447031 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95"] Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.493436 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mdl\" (UniqueName: \"kubernetes.io/projected/6092738b-995d-48bd-a9a7-0c5b4caebea9-kube-api-access-n4mdl\") pod \"openstack-operator-controller-operator-6f9b497985-b8x95\" (UID: \"6092738b-995d-48bd-a9a7-0c5b4caebea9\") " pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.594435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n4mdl\" (UniqueName: \"kubernetes.io/projected/6092738b-995d-48bd-a9a7-0c5b4caebea9-kube-api-access-n4mdl\") pod \"openstack-operator-controller-operator-6f9b497985-b8x95\" (UID: \"6092738b-995d-48bd-a9a7-0c5b4caebea9\") " pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.617403 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mdl\" (UniqueName: \"kubernetes.io/projected/6092738b-995d-48bd-a9a7-0c5b4caebea9-kube-api-access-n4mdl\") pod \"openstack-operator-controller-operator-6f9b497985-b8x95\" (UID: \"6092738b-995d-48bd-a9a7-0c5b4caebea9\") " pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:42 crc kubenswrapper[4837]: I1014 13:13:42.741659 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:43 crc kubenswrapper[4837]: I1014 13:13:43.179037 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95"] Oct 14 13:13:43 crc kubenswrapper[4837]: W1014 13:13:43.184545 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6092738b_995d_48bd_a9a7_0c5b4caebea9.slice/crio-d2035b5a6ad19ee5841bb83b02430b0797923d258b1bbd43b0a80783d08f604e WatchSource:0}: Error finding container d2035b5a6ad19ee5841bb83b02430b0797923d258b1bbd43b0a80783d08f604e: Status 404 returned error can't find the container with id d2035b5a6ad19ee5841bb83b02430b0797923d258b1bbd43b0a80783d08f604e Oct 14 13:13:43 crc kubenswrapper[4837]: I1014 13:13:43.532296 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" 
event={"ID":"6092738b-995d-48bd-a9a7-0c5b4caebea9","Type":"ContainerStarted","Data":"d2035b5a6ad19ee5841bb83b02430b0797923d258b1bbd43b0a80783d08f604e"} Oct 14 13:13:47 crc kubenswrapper[4837]: I1014 13:13:47.577335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" event={"ID":"6092738b-995d-48bd-a9a7-0c5b4caebea9","Type":"ContainerStarted","Data":"80cab809d5fc27d52c2ae64c38927244caaf856f50b4da80761d3d42e516ed82"} Oct 14 13:13:49 crc kubenswrapper[4837]: I1014 13:13:49.591610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" event={"ID":"6092738b-995d-48bd-a9a7-0c5b4caebea9","Type":"ContainerStarted","Data":"f70d6f45845a33fb58279c61157fe7a57868b274f68fa3f85ccccd32eeef5a40"} Oct 14 13:13:49 crc kubenswrapper[4837]: I1014 13:13:49.591993 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:49 crc kubenswrapper[4837]: I1014 13:13:49.634499 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" podStartSLOduration=1.7391705530000001 podStartE2EDuration="7.63445911s" podCreationTimestamp="2025-10-14 13:13:42 +0000 UTC" firstStartedPulling="2025-10-14 13:13:43.190441905 +0000 UTC m=+761.107441718" lastFinishedPulling="2025-10-14 13:13:49.085730452 +0000 UTC m=+767.002730275" observedRunningTime="2025-10-14 13:13:49.629619489 +0000 UTC m=+767.546619332" watchObservedRunningTime="2025-10-14 13:13:49.63445911 +0000 UTC m=+767.551458923" Oct 14 13:13:52 crc kubenswrapper[4837]: I1014 13:13:52.745536 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6f9b497985-b8x95" Oct 14 13:13:58 crc kubenswrapper[4837]: I1014 
13:13:58.951894 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d5l5r"] Oct 14 13:13:58 crc kubenswrapper[4837]: I1014 13:13:58.958319 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:58 crc kubenswrapper[4837]: I1014 13:13:58.963802 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5l5r"] Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.040282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mxf\" (UniqueName: \"kubernetes.io/projected/4022a976-f86f-494f-b3ce-aeeee83f7d58-kube-api-access-42mxf\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.040340 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-utilities\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.040378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-catalog-content\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.141548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42mxf\" (UniqueName: \"kubernetes.io/projected/4022a976-f86f-494f-b3ce-aeeee83f7d58-kube-api-access-42mxf\") pod 
\"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.141599 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-utilities\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.141629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-catalog-content\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.142082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-utilities\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.142110 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-catalog-content\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.160170 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mxf\" (UniqueName: \"kubernetes.io/projected/4022a976-f86f-494f-b3ce-aeeee83f7d58-kube-api-access-42mxf\") pod \"redhat-operators-d5l5r\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " 
pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.277348 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:13:59 crc kubenswrapper[4837]: I1014 13:13:59.703808 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5l5r"] Oct 14 13:13:59 crc kubenswrapper[4837]: W1014 13:13:59.717400 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4022a976_f86f_494f_b3ce_aeeee83f7d58.slice/crio-be354a528b7e2196c893ad11503a5c906e38040d693ed65a28f796ab6b03e0d2 WatchSource:0}: Error finding container be354a528b7e2196c893ad11503a5c906e38040d693ed65a28f796ab6b03e0d2: Status 404 returned error can't find the container with id be354a528b7e2196c893ad11503a5c906e38040d693ed65a28f796ab6b03e0d2 Oct 14 13:14:00 crc kubenswrapper[4837]: I1014 13:14:00.663113 4837 generic.go:334] "Generic (PLEG): container finished" podID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerID="182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054" exitCode=0 Oct 14 13:14:00 crc kubenswrapper[4837]: I1014 13:14:00.663228 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5l5r" event={"ID":"4022a976-f86f-494f-b3ce-aeeee83f7d58","Type":"ContainerDied","Data":"182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054"} Oct 14 13:14:00 crc kubenswrapper[4837]: I1014 13:14:00.663417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5l5r" event={"ID":"4022a976-f86f-494f-b3ce-aeeee83f7d58","Type":"ContainerStarted","Data":"be354a528b7e2196c893ad11503a5c906e38040d693ed65a28f796ab6b03e0d2"} Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.331804 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-86zx9"] Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.332945 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.354598 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86zx9"] Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.396634 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-catalog-content\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.396703 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-utilities\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.396830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9nbd\" (UniqueName: \"kubernetes.io/projected/b2ab3bcb-2462-4932-915b-cf194230317c-kube-api-access-r9nbd\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.497926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-catalog-content\") pod \"community-operators-86zx9\" (UID: 
\"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.498006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-utilities\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.498054 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9nbd\" (UniqueName: \"kubernetes.io/projected/b2ab3bcb-2462-4932-915b-cf194230317c-kube-api-access-r9nbd\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.498817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-catalog-content\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.498822 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-utilities\") pod \"community-operators-86zx9\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.525996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9nbd\" (UniqueName: \"kubernetes.io/projected/b2ab3bcb-2462-4932-915b-cf194230317c-kube-api-access-r9nbd\") pod \"community-operators-86zx9\" (UID: 
\"b2ab3bcb-2462-4932-915b-cf194230317c\") " pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.676296 4837 generic.go:334] "Generic (PLEG): container finished" podID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerID="92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c" exitCode=0 Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.676368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5l5r" event={"ID":"4022a976-f86f-494f-b3ce-aeeee83f7d58","Type":"ContainerDied","Data":"92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c"} Oct 14 13:14:02 crc kubenswrapper[4837]: I1014 13:14:02.705728 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:03 crc kubenswrapper[4837]: I1014 13:14:03.189055 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86zx9"] Oct 14 13:14:03 crc kubenswrapper[4837]: W1014 13:14:03.201429 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ab3bcb_2462_4932_915b_cf194230317c.slice/crio-ff3ac4812c00617dc4b7e6f713049e0086baafc4b283e4a78e9fb8773f1e1e3f WatchSource:0}: Error finding container ff3ac4812c00617dc4b7e6f713049e0086baafc4b283e4a78e9fb8773f1e1e3f: Status 404 returned error can't find the container with id ff3ac4812c00617dc4b7e6f713049e0086baafc4b283e4a78e9fb8773f1e1e3f Oct 14 13:14:03 crc kubenswrapper[4837]: I1014 13:14:03.687016 4837 generic.go:334] "Generic (PLEG): container finished" podID="b2ab3bcb-2462-4932-915b-cf194230317c" containerID="2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277" exitCode=0 Oct 14 13:14:03 crc kubenswrapper[4837]: I1014 13:14:03.687085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerDied","Data":"2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277"} Oct 14 13:14:03 crc kubenswrapper[4837]: I1014 13:14:03.687463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerStarted","Data":"ff3ac4812c00617dc4b7e6f713049e0086baafc4b283e4a78e9fb8773f1e1e3f"} Oct 14 13:14:03 crc kubenswrapper[4837]: I1014 13:14:03.693649 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5l5r" event={"ID":"4022a976-f86f-494f-b3ce-aeeee83f7d58","Type":"ContainerStarted","Data":"3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3"} Oct 14 13:14:03 crc kubenswrapper[4837]: I1014 13:14:03.725537 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d5l5r" podStartSLOduration=3.157261316 podStartE2EDuration="5.725517021s" podCreationTimestamp="2025-10-14 13:13:58 +0000 UTC" firstStartedPulling="2025-10-14 13:14:00.666987886 +0000 UTC m=+778.583987719" lastFinishedPulling="2025-10-14 13:14:03.235243611 +0000 UTC m=+781.152243424" observedRunningTime="2025-10-14 13:14:03.724802012 +0000 UTC m=+781.641801825" watchObservedRunningTime="2025-10-14 13:14:03.725517021 +0000 UTC m=+781.642516854" Oct 14 13:14:04 crc kubenswrapper[4837]: I1014 13:14:04.701188 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerStarted","Data":"ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625"} Oct 14 13:14:05 crc kubenswrapper[4837]: I1014 13:14:05.712988 4837 generic.go:334] "Generic (PLEG): container finished" podID="b2ab3bcb-2462-4932-915b-cf194230317c" 
containerID="ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625" exitCode=0 Oct 14 13:14:05 crc kubenswrapper[4837]: I1014 13:14:05.713116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerDied","Data":"ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625"} Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.534503 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q7vc2"] Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.535698 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.548441 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7vc2"] Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.661737 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlg6s\" (UniqueName: \"kubernetes.io/projected/234afcf6-e859-4b49-85cd-efc2319360d7-kube-api-access-nlg6s\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.661778 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-catalog-content\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.661814 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-utilities\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.720817 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerStarted","Data":"2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e"} Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.741786 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86zx9" podStartSLOduration=2.275242995 podStartE2EDuration="4.741747335s" podCreationTimestamp="2025-10-14 13:14:02 +0000 UTC" firstStartedPulling="2025-10-14 13:14:03.689345585 +0000 UTC m=+781.606345408" lastFinishedPulling="2025-10-14 13:14:06.155849935 +0000 UTC m=+784.072849748" observedRunningTime="2025-10-14 13:14:06.737474219 +0000 UTC m=+784.654474042" watchObservedRunningTime="2025-10-14 13:14:06.741747335 +0000 UTC m=+784.658747158" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.763734 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlg6s\" (UniqueName: \"kubernetes.io/projected/234afcf6-e859-4b49-85cd-efc2319360d7-kube-api-access-nlg6s\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.763790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-catalog-content\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc 
kubenswrapper[4837]: I1014 13:14:06.763852 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-utilities\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.764349 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-catalog-content\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.766097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-utilities\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.795885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlg6s\" (UniqueName: \"kubernetes.io/projected/234afcf6-e859-4b49-85cd-efc2319360d7-kube-api-access-nlg6s\") pod \"certified-operators-q7vc2\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:06 crc kubenswrapper[4837]: I1014 13:14:06.849335 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:07 crc kubenswrapper[4837]: I1014 13:14:07.374126 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7vc2"] Oct 14 13:14:07 crc kubenswrapper[4837]: I1014 13:14:07.728986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7vc2" event={"ID":"234afcf6-e859-4b49-85cd-efc2319360d7","Type":"ContainerStarted","Data":"cc3b4f513516c837111c7d0cdddfd179aa618c763f26075d5bedfd6eee27e472"} Oct 14 13:14:08 crc kubenswrapper[4837]: I1014 13:14:08.738769 4837 generic.go:334] "Generic (PLEG): container finished" podID="234afcf6-e859-4b49-85cd-efc2319360d7" containerID="c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb" exitCode=0 Oct 14 13:14:08 crc kubenswrapper[4837]: I1014 13:14:08.738824 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7vc2" event={"ID":"234afcf6-e859-4b49-85cd-efc2319360d7","Type":"ContainerDied","Data":"c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb"} Oct 14 13:14:09 crc kubenswrapper[4837]: I1014 13:14:09.278184 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:14:09 crc kubenswrapper[4837]: I1014 13:14:09.278524 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:14:09 crc kubenswrapper[4837]: I1014 13:14:09.326963 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:14:09 crc kubenswrapper[4837]: I1014 13:14:09.802372 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:14:10 crc kubenswrapper[4837]: I1014 13:14:10.763566 4837 generic.go:334] "Generic 
(PLEG): container finished" podID="234afcf6-e859-4b49-85cd-efc2319360d7" containerID="ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32" exitCode=0 Oct 14 13:14:10 crc kubenswrapper[4837]: I1014 13:14:10.763667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7vc2" event={"ID":"234afcf6-e859-4b49-85cd-efc2319360d7","Type":"ContainerDied","Data":"ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32"} Oct 14 13:14:11 crc kubenswrapper[4837]: I1014 13:14:11.773359 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7vc2" event={"ID":"234afcf6-e859-4b49-85cd-efc2319360d7","Type":"ContainerStarted","Data":"829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa"} Oct 14 13:14:11 crc kubenswrapper[4837]: I1014 13:14:11.800685 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7vc2" podStartSLOduration=3.259436636 podStartE2EDuration="5.800664212s" podCreationTimestamp="2025-10-14 13:14:06 +0000 UTC" firstStartedPulling="2025-10-14 13:14:08.741473668 +0000 UTC m=+786.658473481" lastFinishedPulling="2025-10-14 13:14:11.282701214 +0000 UTC m=+789.199701057" observedRunningTime="2025-10-14 13:14:11.794044413 +0000 UTC m=+789.711044236" watchObservedRunningTime="2025-10-14 13:14:11.800664212 +0000 UTC m=+789.717664035" Oct 14 13:14:12 crc kubenswrapper[4837]: I1014 13:14:12.707073 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:12 crc kubenswrapper[4837]: I1014 13:14:12.707205 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:12 crc kubenswrapper[4837]: I1014 13:14:12.725338 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d5l5r"] Oct 14 
13:14:12 crc kubenswrapper[4837]: I1014 13:14:12.725591 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d5l5r" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="registry-server" containerID="cri-o://3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3" gracePeriod=2 Oct 14 13:14:12 crc kubenswrapper[4837]: I1014 13:14:12.757557 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:12 crc kubenswrapper[4837]: I1014 13:14:12.820631 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:13 crc kubenswrapper[4837]: E1014 13:14:13.006432 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4022a976_f86f_494f_b3ce_aeeee83f7d58.slice/crio-3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3.scope\": RecentStats: unable to find data in memory cache]" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.709743 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.790362 4837 generic.go:334] "Generic (PLEG): container finished" podID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerID="3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3" exitCode=0 Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.790516 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d5l5r" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.790510 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5l5r" event={"ID":"4022a976-f86f-494f-b3ce-aeeee83f7d58","Type":"ContainerDied","Data":"3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3"} Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.790638 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5l5r" event={"ID":"4022a976-f86f-494f-b3ce-aeeee83f7d58","Type":"ContainerDied","Data":"be354a528b7e2196c893ad11503a5c906e38040d693ed65a28f796ab6b03e0d2"} Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.790682 4837 scope.go:117] "RemoveContainer" containerID="3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.810364 4837 scope.go:117] "RemoveContainer" containerID="92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.854463 4837 scope.go:117] "RemoveContainer" containerID="182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.874887 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42mxf\" (UniqueName: \"kubernetes.io/projected/4022a976-f86f-494f-b3ce-aeeee83f7d58-kube-api-access-42mxf\") pod \"4022a976-f86f-494f-b3ce-aeeee83f7d58\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.874977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-utilities\") pod \"4022a976-f86f-494f-b3ce-aeeee83f7d58\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " Oct 14 13:14:13 crc 
kubenswrapper[4837]: I1014 13:14:13.875032 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-catalog-content\") pod \"4022a976-f86f-494f-b3ce-aeeee83f7d58\" (UID: \"4022a976-f86f-494f-b3ce-aeeee83f7d58\") " Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.876033 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-utilities" (OuterVolumeSpecName: "utilities") pod "4022a976-f86f-494f-b3ce-aeeee83f7d58" (UID: "4022a976-f86f-494f-b3ce-aeeee83f7d58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.883304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4022a976-f86f-494f-b3ce-aeeee83f7d58-kube-api-access-42mxf" (OuterVolumeSpecName: "kube-api-access-42mxf") pod "4022a976-f86f-494f-b3ce-aeeee83f7d58" (UID: "4022a976-f86f-494f-b3ce-aeeee83f7d58"). InnerVolumeSpecName "kube-api-access-42mxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.893257 4837 scope.go:117] "RemoveContainer" containerID="3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3" Oct 14 13:14:13 crc kubenswrapper[4837]: E1014 13:14:13.894532 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3\": container with ID starting with 3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3 not found: ID does not exist" containerID="3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.894584 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3"} err="failed to get container status \"3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3\": rpc error: code = NotFound desc = could not find container \"3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3\": container with ID starting with 3aac01b11c1e0a91aac7cb72dc61ec65654343a0c3630d3989b44600b649bbc3 not found: ID does not exist" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.894629 4837 scope.go:117] "RemoveContainer" containerID="92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c" Oct 14 13:14:13 crc kubenswrapper[4837]: E1014 13:14:13.898527 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c\": container with ID starting with 92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c not found: ID does not exist" containerID="92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.898572 
4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c"} err="failed to get container status \"92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c\": rpc error: code = NotFound desc = could not find container \"92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c\": container with ID starting with 92b19e12c9d4db80918ec011275beba1bb48f30b190fae0c3d1d55e80764599c not found: ID does not exist" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.898593 4837 scope.go:117] "RemoveContainer" containerID="182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054" Oct 14 13:14:13 crc kubenswrapper[4837]: E1014 13:14:13.898916 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054\": container with ID starting with 182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054 not found: ID does not exist" containerID="182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.899028 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054"} err="failed to get container status \"182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054\": rpc error: code = NotFound desc = could not find container \"182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054\": container with ID starting with 182346fa158e4990d03562aecb1ca62640a517c89e9ad2970e1984e6fecb7054 not found: ID does not exist" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.967251 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "4022a976-f86f-494f-b3ce-aeeee83f7d58" (UID: "4022a976-f86f-494f-b3ce-aeeee83f7d58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.976959 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.976997 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42mxf\" (UniqueName: \"kubernetes.io/projected/4022a976-f86f-494f-b3ce-aeeee83f7d58-kube-api-access-42mxf\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:13 crc kubenswrapper[4837]: I1014 13:14:13.977012 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4022a976-f86f-494f-b3ce-aeeee83f7d58-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:14 crc kubenswrapper[4837]: I1014 13:14:14.121841 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d5l5r"] Oct 14 13:14:14 crc kubenswrapper[4837]: I1014 13:14:14.125931 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d5l5r"] Oct 14 13:14:14 crc kubenswrapper[4837]: I1014 13:14:14.800138 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" path="/var/lib/kubelet/pods/4022a976-f86f-494f-b3ce-aeeee83f7d58/volumes" Oct 14 13:14:16 crc kubenswrapper[4837]: I1014 13:14:16.850174 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:16 crc kubenswrapper[4837]: I1014 13:14:16.851402 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:16 crc kubenswrapper[4837]: I1014 13:14:16.929287 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86zx9"] Oct 14 13:14:16 crc kubenswrapper[4837]: I1014 13:14:16.929490 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86zx9" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="registry-server" containerID="cri-o://2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e" gracePeriod=2 Oct 14 13:14:16 crc kubenswrapper[4837]: I1014 13:14:16.938806 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.570113 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.720801 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-utilities\") pod \"b2ab3bcb-2462-4932-915b-cf194230317c\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.721517 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9nbd\" (UniqueName: \"kubernetes.io/projected/b2ab3bcb-2462-4932-915b-cf194230317c-kube-api-access-r9nbd\") pod \"b2ab3bcb-2462-4932-915b-cf194230317c\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.721690 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-catalog-content\") pod 
\"b2ab3bcb-2462-4932-915b-cf194230317c\" (UID: \"b2ab3bcb-2462-4932-915b-cf194230317c\") " Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.722312 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-utilities" (OuterVolumeSpecName: "utilities") pod "b2ab3bcb-2462-4932-915b-cf194230317c" (UID: "b2ab3bcb-2462-4932-915b-cf194230317c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.730383 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ab3bcb-2462-4932-915b-cf194230317c-kube-api-access-r9nbd" (OuterVolumeSpecName: "kube-api-access-r9nbd") pod "b2ab3bcb-2462-4932-915b-cf194230317c" (UID: "b2ab3bcb-2462-4932-915b-cf194230317c"). InnerVolumeSpecName "kube-api-access-r9nbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.796946 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2ab3bcb-2462-4932-915b-cf194230317c" (UID: "b2ab3bcb-2462-4932-915b-cf194230317c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.824055 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9nbd\" (UniqueName: \"kubernetes.io/projected/b2ab3bcb-2462-4932-915b-cf194230317c-kube-api-access-r9nbd\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.824098 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.824116 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ab3bcb-2462-4932-915b-cf194230317c-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.830854 4837 generic.go:334] "Generic (PLEG): container finished" podID="b2ab3bcb-2462-4932-915b-cf194230317c" containerID="2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e" exitCode=0 Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.830948 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86zx9" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.831036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerDied","Data":"2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e"} Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.831073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86zx9" event={"ID":"b2ab3bcb-2462-4932-915b-cf194230317c","Type":"ContainerDied","Data":"ff3ac4812c00617dc4b7e6f713049e0086baafc4b283e4a78e9fb8773f1e1e3f"} Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.831093 4837 scope.go:117] "RemoveContainer" containerID="2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.860003 4837 scope.go:117] "RemoveContainer" containerID="ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.874121 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86zx9"] Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.881823 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86zx9"] Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.891927 4837 scope.go:117] "RemoveContainer" containerID="2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.905316 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.913808 4837 scope.go:117] "RemoveContainer" containerID="2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e" Oct 14 13:14:17 crc 
kubenswrapper[4837]: E1014 13:14:17.914359 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e\": container with ID starting with 2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e not found: ID does not exist" containerID="2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.914413 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e"} err="failed to get container status \"2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e\": rpc error: code = NotFound desc = could not find container \"2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e\": container with ID starting with 2e7d4d436c8dbb21a4806a78d2f6d98c933b955c0fd480d7a98a150f3b67e26e not found: ID does not exist" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.914450 4837 scope.go:117] "RemoveContainer" containerID="ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625" Oct 14 13:14:17 crc kubenswrapper[4837]: E1014 13:14:17.914754 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625\": container with ID starting with ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625 not found: ID does not exist" containerID="ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.914845 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625"} err="failed to get container status 
\"ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625\": rpc error: code = NotFound desc = could not find container \"ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625\": container with ID starting with ad8c542a2d816945f20e4b7b4f7430644637b82092918f904ef64c4003d06625 not found: ID does not exist" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.914917 4837 scope.go:117] "RemoveContainer" containerID="2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277" Oct 14 13:14:17 crc kubenswrapper[4837]: E1014 13:14:17.915286 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277\": container with ID starting with 2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277 not found: ID does not exist" containerID="2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277" Oct 14 13:14:17 crc kubenswrapper[4837]: I1014 13:14:17.915318 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277"} err="failed to get container status \"2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277\": rpc error: code = NotFound desc = could not find container \"2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277\": container with ID starting with 2206fec33417e25f1402e1b61a336105ca53442a01d0dc8d4ab26d9e8d29d277 not found: ID does not exist" Oct 14 13:14:18 crc kubenswrapper[4837]: I1014 13:14:18.799236 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" path="/var/lib/kubelet/pods/b2ab3bcb-2462-4932-915b-cf194230317c/volumes" Oct 14 13:14:19 crc kubenswrapper[4837]: I1014 13:14:19.330120 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7vc2"] Oct 14 
13:14:20 crc kubenswrapper[4837]: I1014 13:14:20.850097 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7vc2" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="registry-server" containerID="cri-o://829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa" gracePeriod=2 Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.295534 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.481903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-catalog-content\") pod \"234afcf6-e859-4b49-85cd-efc2319360d7\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.482347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlg6s\" (UniqueName: \"kubernetes.io/projected/234afcf6-e859-4b49-85cd-efc2319360d7-kube-api-access-nlg6s\") pod \"234afcf6-e859-4b49-85cd-efc2319360d7\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.482399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-utilities\") pod \"234afcf6-e859-4b49-85cd-efc2319360d7\" (UID: \"234afcf6-e859-4b49-85cd-efc2319360d7\") " Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.483470 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-utilities" (OuterVolumeSpecName: "utilities") pod "234afcf6-e859-4b49-85cd-efc2319360d7" (UID: "234afcf6-e859-4b49-85cd-efc2319360d7"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.489028 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234afcf6-e859-4b49-85cd-efc2319360d7-kube-api-access-nlg6s" (OuterVolumeSpecName: "kube-api-access-nlg6s") pod "234afcf6-e859-4b49-85cd-efc2319360d7" (UID: "234afcf6-e859-4b49-85cd-efc2319360d7"). InnerVolumeSpecName "kube-api-access-nlg6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.533801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "234afcf6-e859-4b49-85cd-efc2319360d7" (UID: "234afcf6-e859-4b49-85cd-efc2319360d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.583755 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlg6s\" (UniqueName: \"kubernetes.io/projected/234afcf6-e859-4b49-85cd-efc2319360d7-kube-api-access-nlg6s\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.583797 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.583808 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/234afcf6-e859-4b49-85cd-efc2319360d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.859409 4837 generic.go:334] "Generic (PLEG): container finished" podID="234afcf6-e859-4b49-85cd-efc2319360d7" 
containerID="829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa" exitCode=0 Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.859454 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7vc2" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.859471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7vc2" event={"ID":"234afcf6-e859-4b49-85cd-efc2319360d7","Type":"ContainerDied","Data":"829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa"} Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.859525 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7vc2" event={"ID":"234afcf6-e859-4b49-85cd-efc2319360d7","Type":"ContainerDied","Data":"cc3b4f513516c837111c7d0cdddfd179aa618c763f26075d5bedfd6eee27e472"} Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.859594 4837 scope.go:117] "RemoveContainer" containerID="829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.877313 4837 scope.go:117] "RemoveContainer" containerID="ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.897364 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7vc2"] Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.903820 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q7vc2"] Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.915961 4837 scope.go:117] "RemoveContainer" containerID="c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.948945 4837 scope.go:117] "RemoveContainer" containerID="829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa" Oct 14 
13:14:21 crc kubenswrapper[4837]: E1014 13:14:21.949469 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa\": container with ID starting with 829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa not found: ID does not exist" containerID="829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.949516 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa"} err="failed to get container status \"829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa\": rpc error: code = NotFound desc = could not find container \"829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa\": container with ID starting with 829e3d3578ebda732bf675fb6395562af6db1adb059edb89389545922af1e7fa not found: ID does not exist" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.949545 4837 scope.go:117] "RemoveContainer" containerID="ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32" Oct 14 13:14:21 crc kubenswrapper[4837]: E1014 13:14:21.949915 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32\": container with ID starting with ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32 not found: ID does not exist" containerID="ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.949940 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32"} err="failed to get container status 
\"ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32\": rpc error: code = NotFound desc = could not find container \"ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32\": container with ID starting with ff75e6cfff4464462cf89948872bf6c05fc28ea718106a67dc9bdd7a4fd0fa32 not found: ID does not exist" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.949956 4837 scope.go:117] "RemoveContainer" containerID="c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb" Oct 14 13:14:21 crc kubenswrapper[4837]: E1014 13:14:21.950234 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb\": container with ID starting with c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb not found: ID does not exist" containerID="c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb" Oct 14 13:14:21 crc kubenswrapper[4837]: I1014 13:14:21.950261 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb"} err="failed to get container status \"c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb\": rpc error: code = NotFound desc = could not find container \"c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb\": container with ID starting with c01f6afaab4d340a4736c88520c62bcfc202bc33765a1535ef8ad85da4b847cb not found: ID does not exist" Oct 14 13:14:22 crc kubenswrapper[4837]: I1014 13:14:22.795123 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" path="/var/lib/kubelet/pods/234afcf6-e859-4b49-85cd-efc2319360d7/volumes" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529064 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfllp"] Oct 14 13:14:24 
crc kubenswrapper[4837]: E1014 13:14:24.529828 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="extract-utilities" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529843 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="extract-utilities" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529854 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="extract-utilities" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529863 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="extract-utilities" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529877 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529885 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529898 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="extract-content" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529906 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="extract-content" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529917 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="extract-content" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529927 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="extract-content" Oct 14 13:14:24 crc 
kubenswrapper[4837]: E1014 13:14:24.529941 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529949 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529959 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="extract-utilities" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529967 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="extract-utilities" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529979 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.529987 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: E1014 13:14:24.529999 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="extract-content" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.530007 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="extract-content" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.530129 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ab3bcb-2462-4932-915b-cf194230317c" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.530148 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4022a976-f86f-494f-b3ce-aeeee83f7d58" containerName="registry-server" Oct 14 13:14:24 crc 
kubenswrapper[4837]: I1014 13:14:24.530183 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="234afcf6-e859-4b49-85cd-efc2319360d7" containerName="registry-server" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.531132 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.545225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfllp"] Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.624057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-utilities\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.624132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-catalog-content\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.624281 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9plk\" (UniqueName: \"kubernetes.io/projected/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-kube-api-access-d9plk\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.725282 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-utilities\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.725347 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-catalog-content\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.725427 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9plk\" (UniqueName: \"kubernetes.io/projected/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-kube-api-access-d9plk\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.726093 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-utilities\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.726421 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-catalog-content\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.760350 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9plk\" (UniqueName: 
\"kubernetes.io/projected/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-kube-api-access-d9plk\") pod \"redhat-marketplace-sfllp\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:24 crc kubenswrapper[4837]: I1014 13:14:24.850259 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:25 crc kubenswrapper[4837]: I1014 13:14:25.292642 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfllp"] Oct 14 13:14:25 crc kubenswrapper[4837]: I1014 13:14:25.884178 4837 generic.go:334] "Generic (PLEG): container finished" podID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerID="eaad4635ff1dd9778658e4b13b0e7c46af418c63991eb641aab054c4c15fdd7e" exitCode=0 Oct 14 13:14:25 crc kubenswrapper[4837]: I1014 13:14:25.884278 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfllp" event={"ID":"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d","Type":"ContainerDied","Data":"eaad4635ff1dd9778658e4b13b0e7c46af418c63991eb641aab054c4c15fdd7e"} Oct 14 13:14:25 crc kubenswrapper[4837]: I1014 13:14:25.884452 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfllp" event={"ID":"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d","Type":"ContainerStarted","Data":"faea06d47bd37f6cba14232695797f837cd4ad40ee07a2a63ce192bd76efb043"} Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.772599 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.774106 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.777657 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.779174 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.787050 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ccgl4" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.787322 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-h5djs" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.794889 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.796009 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.797712 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.798011 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5z8kr" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.807294 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.829629 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.841987 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.842867 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.847363 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4hsbl" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.851915 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.871802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.872733 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.873366 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.873737 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.883605 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vw2gq" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.885126 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.887142 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-795ck" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.903898 4837 generic.go:334] "Generic (PLEG): container finished" podID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerID="b33c8c495a4912bea2b2bc26f0214eaa462775a286a939850cf99dac94da8556" exitCode=0 Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.903935 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfllp" event={"ID":"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d","Type":"ContainerDied","Data":"b33c8c495a4912bea2b2bc26f0214eaa462775a286a939850cf99dac94da8556"} Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.905489 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.906680 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.910851 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lkmch" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.912858 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.918418 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.919499 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.921778 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.921863 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nt97p" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.922870 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.942459 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.950110 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-87zsz"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.958889 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.967033 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdx77\" (UniqueName: \"kubernetes.io/projected/f915ddfd-5160-4f57-85a8-9b5fe02c1908-kube-api-access-kdx77\") pod \"glance-operator-controller-manager-7bb46cd7d-q9dmc\" (UID: \"f915ddfd-5160-4f57-85a8-9b5fe02c1908\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.967109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdt8p\" (UniqueName: \"kubernetes.io/projected/cfce54d9-39e9-4b1f-bb95-11d72de2cbdc-kube-api-access-fdt8p\") pod \"heat-operator-controller-manager-6d9967f8dd-qkd6g\" (UID: \"cfce54d9-39e9-4b1f-bb95-11d72de2cbdc\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.967187 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg2qh\" (UniqueName: \"kubernetes.io/projected/53370b8e-db35-4a50-af38-f24ac2fad459-kube-api-access-dg2qh\") pod \"cinder-operator-controller-manager-59cdc64769-5f627\" (UID: \"53370b8e-db35-4a50-af38-f24ac2fad459\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.967244 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z822h\" (UniqueName: \"kubernetes.io/projected/e1d5f52e-4c67-4242-bea3-6eef9fb72623-kube-api-access-z822h\") pod \"barbican-operator-controller-manager-64f84fcdbb-49s6c\" (UID: \"e1d5f52e-4c67-4242-bea3-6eef9fb72623\") " 
pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.967314 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfr9\" (UniqueName: \"kubernetes.io/projected/37e6419b-1647-43e2-89ef-67deae94e8b3-kube-api-access-stfr9\") pod \"designate-operator-controller-manager-687df44cdb-jczn9\" (UID: \"37e6419b-1647-43e2-89ef-67deae94e8b3\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.967966 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nwvwn" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.968678 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9"] Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.969589 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:26 crc kubenswrapper[4837]: I1014 13:14:26.973282 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zfwm7" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.015065 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.023013 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-87zsz"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.025284 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.057788 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.071607 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.075507 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.076274 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-k4rg7" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077129 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/62e66325-7f63-4815-9f2d-fafbd138fa4e-cert\") pod \"infra-operator-controller-manager-585fc5b659-7xv4c\" (UID: \"62e66325-7f63-4815-9f2d-fafbd138fa4e\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzf2x\" (UniqueName: \"kubernetes.io/projected/4f4fbd70-1ccf-4509-8552-ab902e8e7a0f-kube-api-access-bzf2x\") pod \"keystone-operator-controller-manager-ddb98f99b-fq8p9\" (UID: \"4f4fbd70-1ccf-4509-8552-ab902e8e7a0f\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077220 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfr9\" (UniqueName: \"kubernetes.io/projected/37e6419b-1647-43e2-89ef-67deae94e8b3-kube-api-access-stfr9\") pod \"designate-operator-controller-manager-687df44cdb-jczn9\" (UID: \"37e6419b-1647-43e2-89ef-67deae94e8b3\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077242 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbll\" (UniqueName: \"kubernetes.io/projected/32cb1840-83d3-40ec-859a-15391e369bde-kube-api-access-dwbll\") pod 
\"ironic-operator-controller-manager-74cb5cbc49-n6pcv\" (UID: \"32cb1840-83d3-40ec-859a-15391e369bde\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077265 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdx77\" (UniqueName: \"kubernetes.io/projected/f915ddfd-5160-4f57-85a8-9b5fe02c1908-kube-api-access-kdx77\") pod \"glance-operator-controller-manager-7bb46cd7d-q9dmc\" (UID: \"f915ddfd-5160-4f57-85a8-9b5fe02c1908\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077285 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdt8p\" (UniqueName: \"kubernetes.io/projected/cfce54d9-39e9-4b1f-bb95-11d72de2cbdc-kube-api-access-fdt8p\") pod \"heat-operator-controller-manager-6d9967f8dd-qkd6g\" (UID: \"cfce54d9-39e9-4b1f-bb95-11d72de2cbdc\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7gs\" (UniqueName: \"kubernetes.io/projected/f7815a82-8a77-47a1-8a07-966eb6340b2b-kube-api-access-2p7gs\") pod \"horizon-operator-controller-manager-6d74794d9b-9hctf\" (UID: \"f7815a82-8a77-47a1-8a07-966eb6340b2b\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg2qh\" (UniqueName: \"kubernetes.io/projected/53370b8e-db35-4a50-af38-f24ac2fad459-kube-api-access-dg2qh\") pod \"cinder-operator-controller-manager-59cdc64769-5f627\" (UID: \"53370b8e-db35-4a50-af38-f24ac2fad459\") " 
pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077355 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z822h\" (UniqueName: \"kubernetes.io/projected/e1d5f52e-4c67-4242-bea3-6eef9fb72623-kube-api-access-z822h\") pod \"barbican-operator-controller-manager-64f84fcdbb-49s6c\" (UID: \"e1d5f52e-4c67-4242-bea3-6eef9fb72623\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077385 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84p5m\" (UniqueName: \"kubernetes.io/projected/62e66325-7f63-4815-9f2d-fafbd138fa4e-kube-api-access-84p5m\") pod \"infra-operator-controller-manager-585fc5b659-7xv4c\" (UID: \"62e66325-7f63-4815-9f2d-fafbd138fa4e\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.077407 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsgs\" (UniqueName: \"kubernetes.io/projected/62be7f3d-ddbe-4470-ace0-0907330b09ac-kube-api-access-vqsgs\") pod \"manila-operator-controller-manager-59578bc799-87zsz\" (UID: \"62be7f3d-ddbe-4470-ace0-0907330b09ac\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.084260 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.087916 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wvtrr" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.103038 4837 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.106957 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg2qh\" (UniqueName: \"kubernetes.io/projected/53370b8e-db35-4a50-af38-f24ac2fad459-kube-api-access-dg2qh\") pod \"cinder-operator-controller-manager-59cdc64769-5f627\" (UID: \"53370b8e-db35-4a50-af38-f24ac2fad459\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.106966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z822h\" (UniqueName: \"kubernetes.io/projected/e1d5f52e-4c67-4242-bea3-6eef9fb72623-kube-api-access-z822h\") pod \"barbican-operator-controller-manager-64f84fcdbb-49s6c\" (UID: \"e1d5f52e-4c67-4242-bea3-6eef9fb72623\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.108526 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.108891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfr9\" (UniqueName: \"kubernetes.io/projected/37e6419b-1647-43e2-89ef-67deae94e8b3-kube-api-access-stfr9\") pod \"designate-operator-controller-manager-687df44cdb-jczn9\" (UID: \"37e6419b-1647-43e2-89ef-67deae94e8b3\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.109478 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdt8p\" (UniqueName: \"kubernetes.io/projected/cfce54d9-39e9-4b1f-bb95-11d72de2cbdc-kube-api-access-fdt8p\") pod \"heat-operator-controller-manager-6d9967f8dd-qkd6g\" (UID: 
\"cfce54d9-39e9-4b1f-bb95-11d72de2cbdc\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.111913 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.115415 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.115493 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.115663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdx77\" (UniqueName: \"kubernetes.io/projected/f915ddfd-5160-4f57-85a8-9b5fe02c1908-kube-api-access-kdx77\") pod \"glance-operator-controller-manager-7bb46cd7d-q9dmc\" (UID: \"f915ddfd-5160-4f57-85a8-9b5fe02c1908\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.116866 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.121176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4vw57" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.121440 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-pxrp2" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.124211 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.127218 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.132562 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.133535 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.135758 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.136094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5mj6q" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.138145 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.139237 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.140761 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.141303 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2dttw" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.142977 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.149727 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.153079 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.154233 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.156071 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-f4mdn" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.157969 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.159205 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.163227 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.163571 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bd6hz" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.166529 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.171337 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsgs\" (UniqueName: \"kubernetes.io/projected/62be7f3d-ddbe-4470-ace0-0907330b09ac-kube-api-access-vqsgs\") pod \"manila-operator-controller-manager-59578bc799-87zsz\" (UID: \"62be7f3d-ddbe-4470-ace0-0907330b09ac\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbnj\" (UniqueName: \"kubernetes.io/projected/7f62a453-6fb4-4769-a2ef-da03024d8e90-kube-api-access-jfbnj\") pod \"neutron-operator-controller-manager-797d478b46-b2hkp\" (UID: \"7f62a453-6fb4-4769-a2ef-da03024d8e90\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/62e66325-7f63-4815-9f2d-fafbd138fa4e-cert\") pod \"infra-operator-controller-manager-585fc5b659-7xv4c\" (UID: \"62e66325-7f63-4815-9f2d-fafbd138fa4e\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178699 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzf2x\" (UniqueName: \"kubernetes.io/projected/4f4fbd70-1ccf-4509-8552-ab902e8e7a0f-kube-api-access-bzf2x\") pod \"keystone-operator-controller-manager-ddb98f99b-fq8p9\" (UID: \"4f4fbd70-1ccf-4509-8552-ab902e8e7a0f\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wt4p\" (UniqueName: \"kubernetes.io/projected/e4f5b829-46e0-4048-9b51-1a9256375d4f-kube-api-access-7wt4p\") pod \"mariadb-operator-controller-manager-5777b4f897-mdzbv\" (UID: \"e4f5b829-46e0-4048-9b51-1a9256375d4f\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbll\" (UniqueName: \"kubernetes.io/projected/32cb1840-83d3-40ec-859a-15391e369bde-kube-api-access-dwbll\") pod \"ironic-operator-controller-manager-74cb5cbc49-n6pcv\" (UID: \"32cb1840-83d3-40ec-859a-15391e369bde\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178830 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7gs\" (UniqueName: \"kubernetes.io/projected/f7815a82-8a77-47a1-8a07-966eb6340b2b-kube-api-access-2p7gs\") pod 
\"horizon-operator-controller-manager-6d74794d9b-9hctf\" (UID: \"f7815a82-8a77-47a1-8a07-966eb6340b2b\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.178878 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84p5m\" (UniqueName: \"kubernetes.io/projected/62e66325-7f63-4815-9f2d-fafbd138fa4e-kube-api-access-84p5m\") pod \"infra-operator-controller-manager-585fc5b659-7xv4c\" (UID: \"62e66325-7f63-4815-9f2d-fafbd138fa4e\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.182895 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/62e66325-7f63-4815-9f2d-fafbd138fa4e-cert\") pod \"infra-operator-controller-manager-585fc5b659-7xv4c\" (UID: \"62e66325-7f63-4815-9f2d-fafbd138fa4e\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.200458 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzf2x\" (UniqueName: \"kubernetes.io/projected/4f4fbd70-1ccf-4509-8552-ab902e8e7a0f-kube-api-access-bzf2x\") pod \"keystone-operator-controller-manager-ddb98f99b-fq8p9\" (UID: \"4f4fbd70-1ccf-4509-8552-ab902e8e7a0f\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.201197 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.201559 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsgs\" (UniqueName: \"kubernetes.io/projected/62be7f3d-ddbe-4470-ace0-0907330b09ac-kube-api-access-vqsgs\") pod \"manila-operator-controller-manager-59578bc799-87zsz\" (UID: \"62be7f3d-ddbe-4470-ace0-0907330b09ac\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.201583 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7gs\" (UniqueName: \"kubernetes.io/projected/f7815a82-8a77-47a1-8a07-966eb6340b2b-kube-api-access-2p7gs\") pod \"horizon-operator-controller-manager-6d74794d9b-9hctf\" (UID: \"f7815a82-8a77-47a1-8a07-966eb6340b2b\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.204648 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84p5m\" (UniqueName: \"kubernetes.io/projected/62e66325-7f63-4815-9f2d-fafbd138fa4e-kube-api-access-84p5m\") pod \"infra-operator-controller-manager-585fc5b659-7xv4c\" (UID: \"62e66325-7f63-4815-9f2d-fafbd138fa4e\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.205353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbll\" (UniqueName: \"kubernetes.io/projected/32cb1840-83d3-40ec-859a-15391e369bde-kube-api-access-dwbll\") pod \"ironic-operator-controller-manager-74cb5cbc49-n6pcv\" (UID: \"32cb1840-83d3-40ec-859a-15391e369bde\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.213361 4837 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.214737 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.215995 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lqrc2" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.219298 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.240988 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.249656 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.274004 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.276232 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.278930 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-q6tsh" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279637 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5d2\" (UniqueName: \"kubernetes.io/projected/4d12bc33-de6d-405c-b539-72ab956b4234-kube-api-access-2j5d2\") pod \"nova-operator-controller-manager-57bb74c7bf-hx2m5\" (UID: \"4d12bc33-de6d-405c-b539-72ab956b4234\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbnj\" (UniqueName: \"kubernetes.io/projected/7f62a453-6fb4-4769-a2ef-da03024d8e90-kube-api-access-jfbnj\") pod \"neutron-operator-controller-manager-797d478b46-b2hkp\" (UID: \"7f62a453-6fb4-4769-a2ef-da03024d8e90\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279707 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvdfp\" (UniqueName: \"kubernetes.io/projected/1ac92ea3-d385-42f1-bc27-59a93f495cbc-kube-api-access-tvdfp\") pod \"ovn-operator-controller-manager-869cc7797f-mlhgx\" (UID: \"1ac92ea3-d385-42f1-bc27-59a93f495cbc\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279737 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wt4p\" (UniqueName: \"kubernetes.io/projected/e4f5b829-46e0-4048-9b51-1a9256375d4f-kube-api-access-7wt4p\") pod 
\"mariadb-operator-controller-manager-5777b4f897-mdzbv\" (UID: \"e4f5b829-46e0-4048-9b51-1a9256375d4f\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279759 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpngm\" (UniqueName: \"kubernetes.io/projected/922d6301-937e-403a-ade6-06620798c61c-kube-api-access-dpngm\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279783 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjlvb\" (UniqueName: \"kubernetes.io/projected/a086b7d2-5401-4754-9825-2425a3a2aa22-kube-api-access-cjlvb\") pod \"placement-operator-controller-manager-664664cb68-jkjjd\" (UID: \"a086b7d2-5401-4754-9825-2425a3a2aa22\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922d6301-937e-403a-ade6-06620798c61c-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279893 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsklx\" (UniqueName: \"kubernetes.io/projected/640618dc-c509-410b-9669-9b77a1f8d068-kube-api-access-jsklx\") pod \"swift-operator-controller-manager-5f4d5dfdc6-t8t4r\" (UID: 
\"640618dc-c509-410b-9669-9b77a1f8d068\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.279941 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zdr\" (UniqueName: \"kubernetes.io/projected/fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e-kube-api-access-t7zdr\") pod \"octavia-operator-controller-manager-6d7c7ddf95-6w7km\" (UID: \"fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.280732 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.283377 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.293090 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.304089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbnj\" (UniqueName: \"kubernetes.io/projected/7f62a453-6fb4-4769-a2ef-da03024d8e90-kube-api-access-jfbnj\") pod \"neutron-operator-controller-manager-797d478b46-b2hkp\" (UID: \"7f62a453-6fb4-4769-a2ef-da03024d8e90\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.305707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wt4p\" (UniqueName: \"kubernetes.io/projected/e4f5b829-46e0-4048-9b51-1a9256375d4f-kube-api-access-7wt4p\") pod \"mariadb-operator-controller-manager-5777b4f897-mdzbv\" (UID: \"e4f5b829-46e0-4048-9b51-1a9256375d4f\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.380129 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.383979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvdfp\" (UniqueName: \"kubernetes.io/projected/1ac92ea3-d385-42f1-bc27-59a93f495cbc-kube-api-access-tvdfp\") pod \"ovn-operator-controller-manager-869cc7797f-mlhgx\" (UID: \"1ac92ea3-d385-42f1-bc27-59a93f495cbc\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpngm\" (UniqueName: \"kubernetes.io/projected/922d6301-937e-403a-ade6-06620798c61c-kube-api-access-dpngm\") pod 
\"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjlvb\" (UniqueName: \"kubernetes.io/projected/a086b7d2-5401-4754-9825-2425a3a2aa22-kube-api-access-cjlvb\") pod \"placement-operator-controller-manager-664664cb68-jkjjd\" (UID: \"a086b7d2-5401-4754-9825-2425a3a2aa22\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922d6301-937e-403a-ade6-06620798c61c-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384138 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxgw\" (UniqueName: \"kubernetes.io/projected/7d205182-3314-4282-800d-4dc57b64f416-kube-api-access-xwxgw\") pod \"telemetry-operator-controller-manager-578874c84d-78pkj\" (UID: \"7d205182-3314-4282-800d-4dc57b64f416\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384178 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsklx\" (UniqueName: \"kubernetes.io/projected/640618dc-c509-410b-9669-9b77a1f8d068-kube-api-access-jsklx\") pod \"swift-operator-controller-manager-5f4d5dfdc6-t8t4r\" (UID: \"640618dc-c509-410b-9669-9b77a1f8d068\") " 
pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384203 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zdr\" (UniqueName: \"kubernetes.io/projected/fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e-kube-api-access-t7zdr\") pod \"octavia-operator-controller-manager-6d7c7ddf95-6w7km\" (UID: \"fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkq7g\" (UniqueName: \"kubernetes.io/projected/c2182e6f-c24c-4164-a269-4c11d34057a7-kube-api-access-pkq7g\") pod \"test-operator-controller-manager-ffcdd6c94-mhs2q\" (UID: \"c2182e6f-c24c-4164-a269-4c11d34057a7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.384252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5d2\" (UniqueName: \"kubernetes.io/projected/4d12bc33-de6d-405c-b539-72ab956b4234-kube-api-access-2j5d2\") pod \"nova-operator-controller-manager-57bb74c7bf-hx2m5\" (UID: \"4d12bc33-de6d-405c-b539-72ab956b4234\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:27 crc kubenswrapper[4837]: E1014 13:14:27.385232 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 14 13:14:27 crc kubenswrapper[4837]: E1014 13:14:27.385290 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/922d6301-937e-403a-ade6-06620798c61c-cert podName:922d6301-937e-403a-ade6-06620798c61c nodeName:}" failed. 
No retries permitted until 2025-10-14 13:14:27.885273937 +0000 UTC m=+805.802273750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/922d6301-937e-403a-ade6-06620798c61c-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" (UID: "922d6301-937e-403a-ade6-06620798c61c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.386122 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.386275 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.402837 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.403375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.403665 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rsdnk" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.409946 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.412637 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5d2\" (UniqueName: \"kubernetes.io/projected/4d12bc33-de6d-405c-b539-72ab956b4234-kube-api-access-2j5d2\") pod \"nova-operator-controller-manager-57bb74c7bf-hx2m5\" (UID: \"4d12bc33-de6d-405c-b539-72ab956b4234\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.413010 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsklx\" (UniqueName: \"kubernetes.io/projected/640618dc-c509-410b-9669-9b77a1f8d068-kube-api-access-jsklx\") pod \"swift-operator-controller-manager-5f4d5dfdc6-t8t4r\" (UID: \"640618dc-c509-410b-9669-9b77a1f8d068\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.415785 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvdfp\" (UniqueName: \"kubernetes.io/projected/1ac92ea3-d385-42f1-bc27-59a93f495cbc-kube-api-access-tvdfp\") pod \"ovn-operator-controller-manager-869cc7797f-mlhgx\" (UID: \"1ac92ea3-d385-42f1-bc27-59a93f495cbc\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.417201 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zdr\" (UniqueName: \"kubernetes.io/projected/fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e-kube-api-access-t7zdr\") pod \"octavia-operator-controller-manager-6d7c7ddf95-6w7km\" (UID: \"fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.419914 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cjlvb\" (UniqueName: \"kubernetes.io/projected/a086b7d2-5401-4754-9825-2425a3a2aa22-kube-api-access-cjlvb\") pod \"placement-operator-controller-manager-664664cb68-jkjjd\" (UID: \"a086b7d2-5401-4754-9825-2425a3a2aa22\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.420123 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpngm\" (UniqueName: \"kubernetes.io/projected/922d6301-937e-403a-ade6-06620798c61c-kube-api-access-dpngm\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.487150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkq7g\" (UniqueName: \"kubernetes.io/projected/c2182e6f-c24c-4164-a269-4c11d34057a7-kube-api-access-pkq7g\") pod \"test-operator-controller-manager-ffcdd6c94-mhs2q\" (UID: \"c2182e6f-c24c-4164-a269-4c11d34057a7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.487288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxgw\" (UniqueName: \"kubernetes.io/projected/7d205182-3314-4282-800d-4dc57b64f416-kube-api-access-xwxgw\") pod \"telemetry-operator-controller-manager-578874c84d-78pkj\" (UID: \"7d205182-3314-4282-800d-4dc57b64f416\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.495137 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.505722 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.509079 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.523321 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.523542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.523708 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jlgjq" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.525792 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.526129 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.532548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxgw\" (UniqueName: \"kubernetes.io/projected/7d205182-3314-4282-800d-4dc57b64f416-kube-api-access-xwxgw\") pod \"telemetry-operator-controller-manager-578874c84d-78pkj\" (UID: \"7d205182-3314-4282-800d-4dc57b64f416\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.534518 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkq7g\" (UniqueName: \"kubernetes.io/projected/c2182e6f-c24c-4164-a269-4c11d34057a7-kube-api-access-pkq7g\") pod \"test-operator-controller-manager-ffcdd6c94-mhs2q\" (UID: \"c2182e6f-c24c-4164-a269-4c11d34057a7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.560592 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.568951 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.570112 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.574045 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.575439 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fdhg5" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.595291 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.610237 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbw4\" (UniqueName: \"kubernetes.io/projected/6017e7af-9d95-42c3-9f9c-bbd3df49f4f4-kube-api-access-nrbw4\") pod \"watcher-operator-controller-manager-646675d848-rfdvw\" (UID: \"6017e7af-9d95-42c3-9f9c-bbd3df49f4f4\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.610316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c7t4\" (UniqueName: \"kubernetes.io/projected/be28404e-866e-4ffd-8cfc-a43090217244-kube-api-access-2c7t4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs\" (UID: \"be28404e-866e-4ffd-8cfc-a43090217244\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.610493 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.617687 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.660685 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.711536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbw4\" (UniqueName: \"kubernetes.io/projected/6017e7af-9d95-42c3-9f9c-bbd3df49f4f4-kube-api-access-nrbw4\") pod \"watcher-operator-controller-manager-646675d848-rfdvw\" (UID: \"6017e7af-9d95-42c3-9f9c-bbd3df49f4f4\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.711631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c7t4\" (UniqueName: \"kubernetes.io/projected/be28404e-866e-4ffd-8cfc-a43090217244-kube-api-access-2c7t4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs\" (UID: \"be28404e-866e-4ffd-8cfc-a43090217244\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.711793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.711851 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpch2\" (UniqueName: \"kubernetes.io/projected/293d5905-c149-4fe1-a09d-204cc4cff4e6-kube-api-access-hpch2\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.761746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbw4\" (UniqueName: \"kubernetes.io/projected/6017e7af-9d95-42c3-9f9c-bbd3df49f4f4-kube-api-access-nrbw4\") pod \"watcher-operator-controller-manager-646675d848-rfdvw\" (UID: \"6017e7af-9d95-42c3-9f9c-bbd3df49f4f4\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.784566 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c7t4\" (UniqueName: \"kubernetes.io/projected/be28404e-866e-4ffd-8cfc-a43090217244-kube-api-access-2c7t4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs\" (UID: \"be28404e-866e-4ffd-8cfc-a43090217244\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.814083 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.814131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpch2\" (UniqueName: \"kubernetes.io/projected/293d5905-c149-4fe1-a09d-204cc4cff4e6-kube-api-access-hpch2\") pod 
\"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:27 crc kubenswrapper[4837]: E1014 13:14:27.816790 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 14 13:14:27 crc kubenswrapper[4837]: E1014 13:14:27.816885 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert podName:293d5905-c149-4fe1-a09d-204cc4cff4e6 nodeName:}" failed. No retries permitted until 2025-10-14 13:14:28.316861903 +0000 UTC m=+806.233861716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert") pod "openstack-operator-controller-manager-84c49f8869-sxmsq" (UID: "293d5905-c149-4fe1-a09d-204cc4cff4e6") : secret "webhook-server-cert" not found Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.834531 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpch2\" (UniqueName: \"kubernetes.io/projected/293d5905-c149-4fe1-a09d-204cc4cff4e6-kube-api-access-hpch2\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.878351 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc"] Oct 14 13:14:27 crc kubenswrapper[4837]: W1014 13:14:27.910457 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf915ddfd_5160_4f57_85a8_9b5fe02c1908.slice/crio-b9249270b43bdf51f9b5067a0a848ad2c979424b24c2c36fe4563e50e3668b5f WatchSource:0}: 
Error finding container b9249270b43bdf51f9b5067a0a848ad2c979424b24c2c36fe4563e50e3668b5f: Status 404 returned error can't find the container with id b9249270b43bdf51f9b5067a0a848ad2c979424b24c2c36fe4563e50e3668b5f Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.914210 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.918619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922d6301-937e-403a-ade6-06620798c61c-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.923243 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/922d6301-937e-403a-ade6-06620798c61c-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b\" (UID: \"922d6301-937e-403a-ade6-06620798c61c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.945379 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9"] Oct 14 13:14:27 crc kubenswrapper[4837]: I1014 13:14:27.972932 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627"] Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.004230 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e6419b_1647_43e2_89ef_67deae94e8b3.slice/crio-42c2a7fa6e966cd52780db4236f509e2147e6bdadaf8110ad45d34594e31e56b WatchSource:0}: Error finding container 42c2a7fa6e966cd52780db4236f509e2147e6bdadaf8110ad45d34594e31e56b: Status 404 returned error can't find the container with id 42c2a7fa6e966cd52780db4236f509e2147e6bdadaf8110ad45d34594e31e56b Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.026326 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.134512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.325054 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:28 crc kubenswrapper[4837]: E1014 13:14:28.325192 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 14 13:14:28 crc kubenswrapper[4837]: E1014 13:14:28.325244 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert podName:293d5905-c149-4fe1-a09d-204cc4cff4e6 nodeName:}" failed. No retries permitted until 2025-10-14 13:14:29.325229781 +0000 UTC m=+807.242229594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert") pod "openstack-operator-controller-manager-84c49f8869-sxmsq" (UID: "293d5905-c149-4fe1-a09d-204cc4cff4e6") : secret "webhook-server-cert" not found Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.344010 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g"] Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.352471 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfce54d9_39e9_4b1f_bb95_11d72de2cbdc.slice/crio-1c0b847ace436274f7bda0d57b9d4a8766377b6636c33126930ec5907e32caff WatchSource:0}: Error finding container 1c0b847ace436274f7bda0d57b9d4a8766377b6636c33126930ec5907e32caff: Status 404 returned error can't find the container with id 1c0b847ace436274f7bda0d57b9d4a8766377b6636c33126930ec5907e32caff Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.363281 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.724668 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.742362 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp"] Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.749438 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e66325_7f63_4815_9f2d_fafbd138fa4e.slice/crio-a106d465763c6baf457f1517d9e1231084c1bd1b70cb86290747d8ad8042bf69 WatchSource:0}: Error finding container 
a106d465763c6baf457f1517d9e1231084c1bd1b70cb86290747d8ad8042bf69: Status 404 returned error can't find the container with id a106d465763c6baf457f1517d9e1231084c1bd1b70cb86290747d8ad8042bf69 Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.753928 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.769795 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-87zsz"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.788826 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r"] Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.845916 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3acb5d_8e6d_4c7a_9f6d_e59e87d6213e.slice/crio-b260dd3ee1848e646661f74ec39c1f692c3e4b38539f5db63c1ec1cb73488a31 WatchSource:0}: Error finding container b260dd3ee1848e646661f74ec39c1f692c3e4b38539f5db63c1ec1cb73488a31: Status 404 returned error can't find the container with id b260dd3ee1848e646661f74ec39c1f692c3e4b38539f5db63c1ec1cb73488a31 Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.864667 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d12bc33_de6d_405c_b539_72ab956b4234.slice/crio-f2e44774c0d6212a5f7781fc8c2da118627b643a8d327d05558072921497ea94 WatchSource:0}: Error finding container f2e44774c0d6212a5f7781fc8c2da118627b643a8d327d05558072921497ea94: Status 404 returned error can't find the container with id f2e44774c0d6212a5f7781fc8c2da118627b643a8d327d05558072921497ea94 Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.866103 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.866277 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.866289 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.877180 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.903459 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q"] Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.912704 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d205182_3314_4282_800d_4dc57b64f416.slice/crio-d41b496e0b14fac584c4eb83923dbb2b63fa0c8e1800e9151592a66697d3b0ec WatchSource:0}: Error finding container d41b496e0b14fac584c4eb83923dbb2b63fa0c8e1800e9151592a66697d3b0ec: Status 404 returned error can't find the container with id d41b496e0b14fac584c4eb83923dbb2b63fa0c8e1800e9151592a66697d3b0ec Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.931092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd"] Oct 14 13:14:28 crc kubenswrapper[4837]: E1014 13:14:28.936272 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pkq7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-ffcdd6c94-mhs2q_openstack-operators(c2182e6f-c24c-4164-a269-4c11d34057a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.937594 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.951360 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv"] Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.954941 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf"] Oct 14 13:14:28 crc kubenswrapper[4837]: W1014 13:14:28.962288 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda086b7d2_5401_4754_9825_2425a3a2aa22.slice/crio-0ae40d5a751656b987a536c5a365277f8dad26f024be32b16c1f33f161f11d62 WatchSource:0}: Error finding container 0ae40d5a751656b987a536c5a365277f8dad26f024be32b16c1f33f161f11d62: Status 404 returned error can't find the container with id 0ae40d5a751656b987a536c5a365277f8dad26f024be32b16c1f33f161f11d62 Oct 14 13:14:28 crc kubenswrapper[4837]: I1014 13:14:28.969519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" event={"ID":"37e6419b-1647-43e2-89ef-67deae94e8b3","Type":"ContainerStarted","Data":"42c2a7fa6e966cd52780db4236f509e2147e6bdadaf8110ad45d34594e31e56b"} Oct 14 13:14:28 crc kubenswrapper[4837]: E1014 13:14:28.970013 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cjlvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-664664cb68-jkjjd_openstack-operators(a086b7d2-5401-4754-9825-2425a3a2aa22): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:28.999494 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:47278ed28e02df00892f941763aa0d69547327318e8a983e07f4577acd288167,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7wt4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5777b4f897-mdzbv_openstack-operators(e4f5b829-46e0-4048-9b51-1a9256375d4f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:28.999664 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" event={"ID":"fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e","Type":"ContainerStarted","Data":"b260dd3ee1848e646661f74ec39c1f692c3e4b38539f5db63c1ec1cb73488a31"} Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.002839 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:063a7e65b4ba98f0506f269ff7525b446eae06a5ed4a61c18ffa33a886500867,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2p7gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-operator-controller-manager-6d74794d9b-9hctf_openstack-operators(f7815a82-8a77-47a1-8a07-966eb6340b2b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.002972 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" event={"ID":"f915ddfd-5160-4f57-85a8-9b5fe02c1908","Type":"ContainerStarted","Data":"b9249270b43bdf51f9b5067a0a848ad2c979424b24c2c36fe4563e50e3668b5f"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.004825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" event={"ID":"4f4fbd70-1ccf-4509-8552-ab902e8e7a0f","Type":"ContainerStarted","Data":"8c89480b3f7b2404cb4f3f0d1eadf774e4031d5a8583231cae38e8184a4ccbf3"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.007522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfllp" event={"ID":"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d","Type":"ContainerStarted","Data":"8525c639f40550529fb4e88435c480dfa4d7d1c638e64039a4c12887f0870806"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.009134 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" event={"ID":"cfce54d9-39e9-4b1f-bb95-11d72de2cbdc","Type":"ContainerStarted","Data":"1c0b847ace436274f7bda0d57b9d4a8766377b6636c33126930ec5907e32caff"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.028034 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" event={"ID":"7f62a453-6fb4-4769-a2ef-da03024d8e90","Type":"ContainerStarted","Data":"ca999c5bbfa9fd98e8fde3e65ae31ea1a642ca9ceb57655dfd5b4b050439cc1c"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.034109 4837 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfllp" podStartSLOduration=3.395057649 podStartE2EDuration="5.034060849s" podCreationTimestamp="2025-10-14 13:14:24 +0000 UTC" firstStartedPulling="2025-10-14 13:14:25.885419062 +0000 UTC m=+803.802418885" lastFinishedPulling="2025-10-14 13:14:27.524422272 +0000 UTC m=+805.441422085" observedRunningTime="2025-10-14 13:14:29.030970066 +0000 UTC m=+806.947969879" watchObservedRunningTime="2025-10-14 13:14:29.034060849 +0000 UTC m=+806.951060662" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.042789 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" event={"ID":"1ac92ea3-d385-42f1-bc27-59a93f495cbc","Type":"ContainerStarted","Data":"6c448e845e7e170877cb3c70d0004f25c5fd991e9d55bf0185cff3b70b1fe6e6"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.046575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" event={"ID":"7d205182-3314-4282-800d-4dc57b64f416","Type":"ContainerStarted","Data":"d41b496e0b14fac584c4eb83923dbb2b63fa0c8e1800e9151592a66697d3b0ec"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.059413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" event={"ID":"4d12bc33-de6d-405c-b539-72ab956b4234","Type":"ContainerStarted","Data":"f2e44774c0d6212a5f7781fc8c2da118627b643a8d327d05558072921497ea94"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.064594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" event={"ID":"62be7f3d-ddbe-4470-ace0-0907330b09ac","Type":"ContainerStarted","Data":"635e6d7c94b317d376c742752cacf20c6704a6edf14d7c51c6e34323fd31b72f"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.081129 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" event={"ID":"53370b8e-db35-4a50-af38-f24ac2fad459","Type":"ContainerStarted","Data":"a68aa850edf4244a8128e370f5deaf82e30c205850bf0ba781db52629eec9bda"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.083476 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" event={"ID":"640618dc-c509-410b-9669-9b77a1f8d068","Type":"ContainerStarted","Data":"d367750cb3dd70c10679937f30f480f7ee6976b3475014468d8628e80e8e29d2"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.089651 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" event={"ID":"32cb1840-83d3-40ec-859a-15391e369bde","Type":"ContainerStarted","Data":"28295cc629bbc7e2ab2e20c6b1b4fa3b4e4688d45756c7f714fbc04c8cd96a03"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.092826 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" event={"ID":"62e66325-7f63-4815-9f2d-fafbd138fa4e","Type":"ContainerStarted","Data":"a106d465763c6baf457f1517d9e1231084c1bd1b70cb86290747d8ad8042bf69"} Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.111367 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs"] Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.117763 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw"] Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.121508 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b"] Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.122641 4837 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2c7t4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs_openstack-operators(be28404e-866e-4ffd-8cfc-a43090217244): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.125379 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" podUID="be28404e-866e-4ffd-8cfc-a43090217244" Oct 14 13:14:29 crc kubenswrapper[4837]: W1014 13:14:29.153876 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod922d6301_937e_403a_ade6_06620798c61c.slice/crio-8e9aa792b8f6487c5ad4f7bcaa4b20d31bd707b5b7bff78a906da12c177e3c58 WatchSource:0}: Error finding container 8e9aa792b8f6487c5ad4f7bcaa4b20d31bd707b5b7bff78a906da12c177e3c58: Status 404 returned error can't find the container with id 8e9aa792b8f6487c5ad4f7bcaa4b20d31bd707b5b7bff78a906da12c177e3c58 Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.154926 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" podUID="c2182e6f-c24c-4164-a269-4c11d34057a7" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.163394 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DB
CLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resour
ces:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpngm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b_openstack-operators(922d6301-937e-403a-ade6-06620798c61c): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.170908 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" podUID="f7815a82-8a77-47a1-8a07-966eb6340b2b" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.174298 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" podUID="a086b7d2-5401-4754-9825-2425a3a2aa22" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.185064 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" podUID="e4f5b829-46e0-4048-9b51-1a9256375d4f" Oct 14 13:14:29 crc kubenswrapper[4837]: E1014 13:14:29.322531 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" podUID="922d6301-937e-403a-ade6-06620798c61c" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.337344 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.343885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/293d5905-c149-4fe1-a09d-204cc4cff4e6-cert\") pod \"openstack-operator-controller-manager-84c49f8869-sxmsq\" (UID: \"293d5905-c149-4fe1-a09d-204cc4cff4e6\") " pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.408632 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:29 crc kubenswrapper[4837]: I1014 13:14:29.657758 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq"] Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.102475 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" event={"ID":"293d5905-c149-4fe1-a09d-204cc4cff4e6","Type":"ContainerStarted","Data":"166ab560fa494edb3fccc05326743a5357e906a576e53810bfb588155f4513a9"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.102524 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" event={"ID":"293d5905-c149-4fe1-a09d-204cc4cff4e6","Type":"ContainerStarted","Data":"5eef65abf12dbf98549c42b6d953782bcba84860dfbd7876a40ce0115ac645cf"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.105797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" event={"ID":"e4f5b829-46e0-4048-9b51-1a9256375d4f","Type":"ContainerStarted","Data":"2d571578da053d1d1969705c1c724730c8c2acf736c896f5f0c949fd96c48327"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.105846 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" 
event={"ID":"e4f5b829-46e0-4048-9b51-1a9256375d4f","Type":"ContainerStarted","Data":"c32b51a154e191964b3ef52300eae308e932138930acb458ad2a9714b5059517"} Oct 14 13:14:30 crc kubenswrapper[4837]: E1014 13:14:30.107691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:47278ed28e02df00892f941763aa0d69547327318e8a983e07f4577acd288167\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" podUID="e4f5b829-46e0-4048-9b51-1a9256375d4f" Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.111032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" event={"ID":"f7815a82-8a77-47a1-8a07-966eb6340b2b","Type":"ContainerStarted","Data":"14c83113dd5764928ebaaaac75cd528444534ef3c350e3c51aac5513ce0b1d04"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.111079 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" event={"ID":"f7815a82-8a77-47a1-8a07-966eb6340b2b","Type":"ContainerStarted","Data":"b7b87198fe998f2f0ad29a428556baacea99e5dfb81bf01cf959db34d9324aec"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.114903 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" event={"ID":"e1d5f52e-4c67-4242-bea3-6eef9fb72623","Type":"ContainerStarted","Data":"78bd93ca911bd0f07b67acf1bee633ed78b5b6933fbcf6afec646479e5537c3b"} Oct 14 13:14:30 crc kubenswrapper[4837]: E1014 13:14:30.116294 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:063a7e65b4ba98f0506f269ff7525b446eae06a5ed4a61c18ffa33a886500867\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" podUID="f7815a82-8a77-47a1-8a07-966eb6340b2b" Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.116368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" event={"ID":"922d6301-937e-403a-ade6-06620798c61c","Type":"ContainerStarted","Data":"036e9a614c31bc787ddae98c6c1754c55d3954563759e8f786d116dd0684333c"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.116417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" event={"ID":"922d6301-937e-403a-ade6-06620798c61c","Type":"ContainerStarted","Data":"8e9aa792b8f6487c5ad4f7bcaa4b20d31bd707b5b7bff78a906da12c177e3c58"} Oct 14 13:14:30 crc kubenswrapper[4837]: E1014 13:14:30.120543 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" podUID="922d6301-937e-403a-ade6-06620798c61c" Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.123719 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" event={"ID":"6017e7af-9d95-42c3-9f9c-bbd3df49f4f4","Type":"ContainerStarted","Data":"379b84e6a867d2033bce1c61539e844623a7dfe8215d0845751a6f0859429563"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.127184 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" event={"ID":"c2182e6f-c24c-4164-a269-4c11d34057a7","Type":"ContainerStarted","Data":"a9ca7bcc68bc3ac64be50e49d13e12b1a75e5fe5156beaec0ae74e5daa64b0ba"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.127215 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" event={"ID":"c2182e6f-c24c-4164-a269-4c11d34057a7","Type":"ContainerStarted","Data":"09b887ab426a6a08c8cda7849aed2850001e365d595438aca0704d2664911c08"} Oct 14 13:14:30 crc kubenswrapper[4837]: E1014 13:14:30.128898 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" podUID="c2182e6f-c24c-4164-a269-4c11d34057a7" Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.129617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" event={"ID":"a086b7d2-5401-4754-9825-2425a3a2aa22","Type":"ContainerStarted","Data":"d3b17c4f87b57d4451d7daa46e99516d41f836178e96f685eb4c4f063a92735e"} Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.129644 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" event={"ID":"a086b7d2-5401-4754-9825-2425a3a2aa22","Type":"ContainerStarted","Data":"0ae40d5a751656b987a536c5a365277f8dad26f024be32b16c1f33f161f11d62"} Oct 14 13:14:30 crc kubenswrapper[4837]: E1014 13:14:30.130567 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" podUID="a086b7d2-5401-4754-9825-2425a3a2aa22" Oct 14 13:14:30 crc kubenswrapper[4837]: I1014 13:14:30.131787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" event={"ID":"be28404e-866e-4ffd-8cfc-a43090217244","Type":"ContainerStarted","Data":"a2b3604ffe2120c728c40c17960b78fd46557d67d63f23407ef895cb79bdd140"} Oct 14 13:14:30 crc kubenswrapper[4837]: E1014 13:14:30.133190 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" podUID="be28404e-866e-4ffd-8cfc-a43090217244" Oct 14 13:14:31 crc kubenswrapper[4837]: I1014 13:14:31.142095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" event={"ID":"293d5905-c149-4fe1-a09d-204cc4cff4e6","Type":"ContainerStarted","Data":"865ffd6f74ba4f6b390e9738c46f6f95889b1b415e5a9030d2cbbdb6534fb156"} Oct 14 13:14:31 crc kubenswrapper[4837]: I1014 13:14:31.143297 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:31 crc kubenswrapper[4837]: E1014 13:14:31.144388 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" podUID="be28404e-866e-4ffd-8cfc-a43090217244" Oct 14 13:14:31 crc kubenswrapper[4837]: E1014 13:14:31.144658 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:47278ed28e02df00892f941763aa0d69547327318e8a983e07f4577acd288167\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" podUID="e4f5b829-46e0-4048-9b51-1a9256375d4f" Oct 14 13:14:31 crc kubenswrapper[4837]: E1014 13:14:31.144724 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:063a7e65b4ba98f0506f269ff7525b446eae06a5ed4a61c18ffa33a886500867\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" podUID="f7815a82-8a77-47a1-8a07-966eb6340b2b" Oct 14 13:14:31 crc kubenswrapper[4837]: E1014 13:14:31.144734 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" podUID="a086b7d2-5401-4754-9825-2425a3a2aa22" Oct 14 13:14:31 crc kubenswrapper[4837]: E1014 13:14:31.144750 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" podUID="c2182e6f-c24c-4164-a269-4c11d34057a7" Oct 14 13:14:31 crc kubenswrapper[4837]: E1014 13:14:31.144922 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" podUID="922d6301-937e-403a-ade6-06620798c61c" Oct 14 13:14:31 crc kubenswrapper[4837]: I1014 13:14:31.172997 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" podStartSLOduration=4.172971178 podStartE2EDuration="4.172971178s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:14:31.172386343 +0000 UTC m=+809.089386156" watchObservedRunningTime="2025-10-14 13:14:31.172971178 +0000 UTC m=+809.089970991" Oct 14 13:14:34 crc kubenswrapper[4837]: I1014 13:14:34.850643 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:34 crc kubenswrapper[4837]: I1014 13:14:34.851040 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:34 crc kubenswrapper[4837]: I1014 13:14:34.923312 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:35 crc kubenswrapper[4837]: I1014 13:14:35.224871 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:35 crc kubenswrapper[4837]: I1014 13:14:35.271069 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfllp"] Oct 14 13:14:37 crc kubenswrapper[4837]: I1014 13:14:37.185445 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfllp" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="registry-server" containerID="cri-o://8525c639f40550529fb4e88435c480dfa4d7d1c638e64039a4c12887f0870806" gracePeriod=2 Oct 14 13:14:38 crc kubenswrapper[4837]: I1014 13:14:38.195732 4837 generic.go:334] "Generic (PLEG): container finished" podID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerID="8525c639f40550529fb4e88435c480dfa4d7d1c638e64039a4c12887f0870806" exitCode=0 Oct 14 13:14:38 crc kubenswrapper[4837]: I1014 13:14:38.195794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfllp" event={"ID":"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d","Type":"ContainerDied","Data":"8525c639f40550529fb4e88435c480dfa4d7d1c638e64039a4c12887f0870806"} Oct 14 13:14:39 crc kubenswrapper[4837]: I1014 13:14:39.421701 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-84c49f8869-sxmsq" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.050034 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.093238 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-catalog-content\") pod \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.093306 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9plk\" (UniqueName: \"kubernetes.io/projected/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-kube-api-access-d9plk\") pod \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.093419 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-utilities\") pod \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\" (UID: \"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d\") " Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.094356 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-utilities" (OuterVolumeSpecName: "utilities") pod "33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" (UID: "33af3f9b-bfd9-40e3-bd61-e63c24cfa89d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.099267 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-kube-api-access-d9plk" (OuterVolumeSpecName: "kube-api-access-d9plk") pod "33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" (UID: "33af3f9b-bfd9-40e3-bd61-e63c24cfa89d"). InnerVolumeSpecName "kube-api-access-d9plk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.111509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" (UID: "33af3f9b-bfd9-40e3-bd61-e63c24cfa89d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.194469 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.194497 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.194512 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9plk\" (UniqueName: \"kubernetes.io/projected/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d-kube-api-access-d9plk\") on node \"crc\" DevicePath \"\"" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.214033 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfllp" event={"ID":"33af3f9b-bfd9-40e3-bd61-e63c24cfa89d","Type":"ContainerDied","Data":"faea06d47bd37f6cba14232695797f837cd4ad40ee07a2a63ce192bd76efb043"} Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.214103 4837 scope.go:117] "RemoveContainer" containerID="8525c639f40550529fb4e88435c480dfa4d7d1c638e64039a4c12887f0870806" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.214051 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfllp" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.219875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" event={"ID":"cfce54d9-39e9-4b1f-bb95-11d72de2cbdc","Type":"ContainerStarted","Data":"4068e35e16c9e377a8489d0dfc0b1a6af5a868ab4b78573bb79178f9b6fdcc35"} Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.247116 4837 scope.go:117] "RemoveContainer" containerID="b33c8c495a4912bea2b2bc26f0214eaa462775a286a939850cf99dac94da8556" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.252197 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfllp"] Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.257431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfllp"] Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.302681 4837 scope.go:117] "RemoveContainer" containerID="eaad4635ff1dd9778658e4b13b0e7c46af418c63991eb641aab054c4c15fdd7e" Oct 14 13:14:40 crc kubenswrapper[4837]: I1014 13:14:40.805689 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" path="/var/lib/kubelet/pods/33af3f9b-bfd9-40e3-bd61-e63c24cfa89d/volumes" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.227121 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" event={"ID":"7f62a453-6fb4-4769-a2ef-da03024d8e90","Type":"ContainerStarted","Data":"3965660372d54487ef924dd9d72aeb6404d7dd1aca165a3b022afb36e839c655"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.227181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" 
event={"ID":"7f62a453-6fb4-4769-a2ef-da03024d8e90","Type":"ContainerStarted","Data":"2feb2608991658c4d8bed435996d2a36bc1701c9247fde71db1271a242a88eca"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.227219 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.228971 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" event={"ID":"62e66325-7f63-4815-9f2d-fafbd138fa4e","Type":"ContainerStarted","Data":"289e82d31044c5447e4380c8c8a7a59bb89b1771da0416f8bd7c47ba34a2efb5"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.230658 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" event={"ID":"640618dc-c509-410b-9669-9b77a1f8d068","Type":"ContainerStarted","Data":"0796f7f82e4c1ba4178c7c9a2e7688022a726b2676a7333a0a9e9bac93e114ca"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.235619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" event={"ID":"4d12bc33-de6d-405c-b539-72ab956b4234","Type":"ContainerStarted","Data":"aad6097b3156de92c73b4f97fbf054700f9fa68059b38a85aa763a09cae647f9"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.237481 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" event={"ID":"6017e7af-9d95-42c3-9f9c-bbd3df49f4f4","Type":"ContainerStarted","Data":"38c3c3740f7aeb6102cbf1a56cdeff3a9a26b0105e4bf5c93131366809325450"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.237522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" 
event={"ID":"6017e7af-9d95-42c3-9f9c-bbd3df49f4f4","Type":"ContainerStarted","Data":"b9a094777ac8cbe0c647b993591ffe5ba97fbf25f86448b11e700d6cd6bc0543"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.237614 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.264562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" podStartSLOduration=4.187879015 podStartE2EDuration="15.264537131s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.748075482 +0000 UTC m=+806.665075295" lastFinishedPulling="2025-10-14 13:14:39.824733598 +0000 UTC m=+817.741733411" observedRunningTime="2025-10-14 13:14:41.263550775 +0000 UTC m=+819.180550588" watchObservedRunningTime="2025-10-14 13:14:41.264537131 +0000 UTC m=+819.181536944" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.273841 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" event={"ID":"e1d5f52e-4c67-4242-bea3-6eef9fb72623","Type":"ContainerStarted","Data":"7c59e59bfc97cf854d5d17920f411604808362be955a7cc1708e59cedd457124"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.304400 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" event={"ID":"1ac92ea3-d385-42f1-bc27-59a93f495cbc","Type":"ContainerStarted","Data":"497450e97fe3c4f9b677c0216f7b0851a1b8ea004cdea454b695cd8154df8853"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.308828 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" podStartSLOduration=3.63789689 podStartE2EDuration="14.308811216s" 
podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:29.155870217 +0000 UTC m=+807.072870030" lastFinishedPulling="2025-10-14 13:14:39.826784533 +0000 UTC m=+817.743784356" observedRunningTime="2025-10-14 13:14:41.29450959 +0000 UTC m=+819.211509403" watchObservedRunningTime="2025-10-14 13:14:41.308811216 +0000 UTC m=+819.225811019" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.327047 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" event={"ID":"cfce54d9-39e9-4b1f-bb95-11d72de2cbdc","Type":"ContainerStarted","Data":"c5d84f1f9f5c2418fa65cb14852c8a07de342bcdc82d51ead4e9d5e7940da168"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.328008 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.348431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" event={"ID":"53370b8e-db35-4a50-af38-f24ac2fad459","Type":"ContainerStarted","Data":"27dd47f8b0f8a465fa1461e5c7647086d7bd2eca374d1547532db969b9aa12b4"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.362575 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" podStartSLOduration=4.054708391 podStartE2EDuration="15.362556896s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.363365201 +0000 UTC m=+806.280365014" lastFinishedPulling="2025-10-14 13:14:39.671213706 +0000 UTC m=+817.588213519" observedRunningTime="2025-10-14 13:14:41.358419555 +0000 UTC m=+819.275419368" watchObservedRunningTime="2025-10-14 13:14:41.362556896 +0000 UTC m=+819.279556709" Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.369403 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" event={"ID":"32cb1840-83d3-40ec-859a-15391e369bde","Type":"ContainerStarted","Data":"c82046129d129b60b36085c965d512416c263b95b8ad4e9db73eae774ebf1e91"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.383386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" event={"ID":"62be7f3d-ddbe-4470-ace0-0907330b09ac","Type":"ContainerStarted","Data":"3aa90cef2dfb6d7227da17d5a908160f3e93a53ea0ca49070e5c0d73bc160411"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.394224 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" event={"ID":"37e6419b-1647-43e2-89ef-67deae94e8b3","Type":"ContainerStarted","Data":"6fa90a530dcc91dc28c72cdf8a0af2911579cc5fd6ba297583c8d02c9cc096ee"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.403381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" event={"ID":"7d205182-3314-4282-800d-4dc57b64f416","Type":"ContainerStarted","Data":"8d9c287105ee0e237e75c4e9f6bd83fd9568f757e94c0ca9f440369259deaa41"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.411213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" event={"ID":"fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e","Type":"ContainerStarted","Data":"5d2d10807f7ece9391660a3170ef1aa6ccf4a9d81dbbe4b1374b644ae0e64f71"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.414878 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" 
event={"ID":"f915ddfd-5160-4f57-85a8-9b5fe02c1908","Type":"ContainerStarted","Data":"da31803cd8261adef5cb3249888fc7db2d7c39276a56d4fd5ae6a56baa11015f"} Oct 14 13:14:41 crc kubenswrapper[4837]: I1014 13:14:41.428396 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" event={"ID":"4f4fbd70-1ccf-4509-8552-ab902e8e7a0f","Type":"ContainerStarted","Data":"5bcb2b4a080e7381065ed84c4d0fb474b0e4b89641c5c3b203f7b7858c2dd5d8"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.435483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" event={"ID":"7d205182-3314-4282-800d-4dc57b64f416","Type":"ContainerStarted","Data":"ddb22e2c2d9a3a490846f5d845a8de3242c892021e502c80fddaa8a40c260e9e"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.435840 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.438682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" event={"ID":"62be7f3d-ddbe-4470-ace0-0907330b09ac","Type":"ContainerStarted","Data":"50440e79089cc1ffce41f38ce7033df3ee14e0df461c0f7f87050107a416abff"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.438829 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.440983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" event={"ID":"4f4fbd70-1ccf-4509-8552-ab902e8e7a0f","Type":"ContainerStarted","Data":"f6a1aa7895ac93703e03ec86a6d19e9e146531bc7f0785054cfaa6307b76ad3a"} Oct 14 13:14:42 crc 
kubenswrapper[4837]: I1014 13:14:42.441186 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.442961 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" event={"ID":"37e6419b-1647-43e2-89ef-67deae94e8b3","Type":"ContainerStarted","Data":"9c159ef30097da96e2ee6751eb2139d7f57aff2258eb1d9751503f64b8214095"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.443014 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.444721 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" event={"ID":"53370b8e-db35-4a50-af38-f24ac2fad459","Type":"ContainerStarted","Data":"f63d254984c1e4f2eb406041be8af5f254ae9c1e0ea3cebcbb6dd760ed88201e"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.444847 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.448506 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" event={"ID":"4d12bc33-de6d-405c-b539-72ab956b4234","Type":"ContainerStarted","Data":"0f0502671aba233cf5bd568833881188a91864f48f54be7d9564317916350c31"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.449026 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.452725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" event={"ID":"32cb1840-83d3-40ec-859a-15391e369bde","Type":"ContainerStarted","Data":"733366130d3a46f33d0e75d9694b64e969746bc453d43c159fafe160f5d7b0f9"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.452897 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.454640 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" event={"ID":"fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e","Type":"ContainerStarted","Data":"6293f202792b8d11c7b5d8a4944332acf8e15e68e651e8936179936dba258902"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.454780 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.456328 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" podStartSLOduration=4.562278115 podStartE2EDuration="15.456307272s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.930747812 +0000 UTC m=+806.847747625" lastFinishedPulling="2025-10-14 13:14:39.824776979 +0000 UTC m=+817.741776782" observedRunningTime="2025-10-14 13:14:42.450505415 +0000 UTC m=+820.367505258" watchObservedRunningTime="2025-10-14 13:14:42.456307272 +0000 UTC m=+820.373307115" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.457366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" 
event={"ID":"62e66325-7f63-4815-9f2d-fafbd138fa4e","Type":"ContainerStarted","Data":"b8a4924b2f48e82ac9610a419daf184f00708f27a251eaacbd4d93326d996c20"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.457627 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.458950 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" event={"ID":"e1d5f52e-4c67-4242-bea3-6eef9fb72623","Type":"ContainerStarted","Data":"c0201baee14bea1282e3dffb57b51c86d818689b6f7d448c3c3a8192e0d025db"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.459101 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.460799 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" event={"ID":"f915ddfd-5160-4f57-85a8-9b5fe02c1908","Type":"ContainerStarted","Data":"8a7511db309b8bb39ac15d1de72555025a0c8c37e2d859365309ccbb54a394c6"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.460977 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.464321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" event={"ID":"1ac92ea3-d385-42f1-bc27-59a93f495cbc","Type":"ContainerStarted","Data":"1a5f87b0b6056e7d89039d2061b970b926a0f0a3e545c05b6f606532d2ca8dec"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.464434 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.466259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" event={"ID":"640618dc-c509-410b-9669-9b77a1f8d068","Type":"ContainerStarted","Data":"0eb61ad48c02bb27c0fe091c105a50b83808e67fd0aba15ddfbc805da8e2061b"} Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.466398 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.472673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" podStartSLOduration=5.57994243 podStartE2EDuration="16.472663063s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.879915409 +0000 UTC m=+806.796915222" lastFinishedPulling="2025-10-14 13:14:39.772636042 +0000 UTC m=+817.689635855" observedRunningTime="2025-10-14 13:14:42.471287716 +0000 UTC m=+820.388287629" watchObservedRunningTime="2025-10-14 13:14:42.472663063 +0000 UTC m=+820.389662876" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.487531 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" podStartSLOduration=4.772784311 podStartE2EDuration="16.487511344s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.006418129 +0000 UTC m=+805.923417942" lastFinishedPulling="2025-10-14 13:14:39.721145162 +0000 UTC m=+817.638144975" observedRunningTime="2025-10-14 13:14:42.486837776 +0000 UTC m=+820.403837609" watchObservedRunningTime="2025-10-14 13:14:42.487511344 +0000 UTC m=+820.404511167" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 
13:14:42.506468 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" podStartSLOduration=5.442984976 podStartE2EDuration="16.506448355s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.763289493 +0000 UTC m=+806.680289306" lastFinishedPulling="2025-10-14 13:14:39.826752872 +0000 UTC m=+817.743752685" observedRunningTime="2025-10-14 13:14:42.50480732 +0000 UTC m=+820.421807143" watchObservedRunningTime="2025-10-14 13:14:42.506448355 +0000 UTC m=+820.423448168" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.526388 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" podStartSLOduration=4.768609527 podStartE2EDuration="16.526371442s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.015657808 +0000 UTC m=+805.932657621" lastFinishedPulling="2025-10-14 13:14:39.773419683 +0000 UTC m=+817.690419536" observedRunningTime="2025-10-14 13:14:42.524842041 +0000 UTC m=+820.441841894" watchObservedRunningTime="2025-10-14 13:14:42.526371442 +0000 UTC m=+820.443371255" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.548180 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" podStartSLOduration=5.746246818 podStartE2EDuration="16.54814331s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.912765566 +0000 UTC m=+806.829765379" lastFinishedPulling="2025-10-14 13:14:39.714662038 +0000 UTC m=+817.631661871" observedRunningTime="2025-10-14 13:14:42.546458914 +0000 UTC m=+820.463458727" watchObservedRunningTime="2025-10-14 13:14:42.54814331 +0000 UTC m=+820.465143133" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.568782 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" podStartSLOduration=4.692959785 podStartE2EDuration="16.568759246s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:27.952263517 +0000 UTC m=+805.869263320" lastFinishedPulling="2025-10-14 13:14:39.828062968 +0000 UTC m=+817.745062781" observedRunningTime="2025-10-14 13:14:42.561483379 +0000 UTC m=+820.478483202" watchObservedRunningTime="2025-10-14 13:14:42.568759246 +0000 UTC m=+820.485759069" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.581019 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" podStartSLOduration=5.28310247 podStartE2EDuration="16.580998446s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.379976609 +0000 UTC m=+806.296976412" lastFinishedPulling="2025-10-14 13:14:39.677872565 +0000 UTC m=+817.594872388" observedRunningTime="2025-10-14 13:14:42.577634436 +0000 UTC m=+820.494634249" watchObservedRunningTime="2025-10-14 13:14:42.580998446 +0000 UTC m=+820.497998269" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.594813 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" podStartSLOduration=5.646488447 podStartE2EDuration="16.594796499s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.880232189 +0000 UTC m=+806.797231992" lastFinishedPulling="2025-10-14 13:14:39.828540231 +0000 UTC m=+817.745540044" observedRunningTime="2025-10-14 13:14:42.591016717 +0000 UTC m=+820.508016540" watchObservedRunningTime="2025-10-14 13:14:42.594796499 +0000 UTC m=+820.511796322" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.613839 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" podStartSLOduration=5.756180127 podStartE2EDuration="16.613812912s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.96960442 +0000 UTC m=+806.886604233" lastFinishedPulling="2025-10-14 13:14:39.827237205 +0000 UTC m=+817.744237018" observedRunningTime="2025-10-14 13:14:42.612474665 +0000 UTC m=+820.529474478" watchObservedRunningTime="2025-10-14 13:14:42.613812912 +0000 UTC m=+820.530812725" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.637699 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" podStartSLOduration=4.549058796 podStartE2EDuration="15.637668585s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.737401554 +0000 UTC m=+806.654401357" lastFinishedPulling="2025-10-14 13:14:39.826011333 +0000 UTC m=+817.743011146" observedRunningTime="2025-10-14 13:14:42.62857959 +0000 UTC m=+820.545579403" watchObservedRunningTime="2025-10-14 13:14:42.637668585 +0000 UTC m=+820.554668438" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.658229 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" podStartSLOduration=5.753985347 podStartE2EDuration="16.65820548s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.767003033 +0000 UTC m=+806.684002846" lastFinishedPulling="2025-10-14 13:14:39.671223166 +0000 UTC m=+817.588222979" observedRunningTime="2025-10-14 13:14:42.653049941 +0000 UTC m=+820.570049764" watchObservedRunningTime="2025-10-14 13:14:42.65820548 +0000 UTC m=+820.575205313" Oct 14 13:14:42 crc kubenswrapper[4837]: I1014 13:14:42.669374 4837 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" podStartSLOduration=4.722796046 podStartE2EDuration="15.669351561s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.826750455 +0000 UTC m=+806.743750268" lastFinishedPulling="2025-10-14 13:14:39.77330597 +0000 UTC m=+817.690305783" observedRunningTime="2025-10-14 13:14:42.667043088 +0000 UTC m=+820.584042901" watchObservedRunningTime="2025-10-14 13:14:42.669351561 +0000 UTC m=+820.586351384" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.132846 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-jczn9" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.145319 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5f627" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.170862 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-q9dmc" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.204661 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qkd6g" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.243975 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-n6pcv" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.255895 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-7xv4c" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.285355 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-59578bc799-87zsz" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.300057 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-fq8p9" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.405541 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-49s6c" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.413033 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-b2hkp" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.528779 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-6w7km" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.531231 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-hx2m5" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.571090 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-mlhgx" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.603488 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-t8t4r" Oct 14 13:14:47 crc kubenswrapper[4837]: I1014 13:14:47.623514 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-78pkj" Oct 14 13:14:48 crc kubenswrapper[4837]: I1014 13:14:48.029500 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-646675d848-rfdvw" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.546425 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" event={"ID":"be28404e-866e-4ffd-8cfc-a43090217244","Type":"ContainerStarted","Data":"2ed729cdaff19cc0b29b183a4031afa6a149658e820fd623f728d9b0cdead95f"} Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.548709 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" event={"ID":"f7815a82-8a77-47a1-8a07-966eb6340b2b","Type":"ContainerStarted","Data":"9bf75c4e5ed22a04b877d6459ee2a5dcfea4c79da9be50c68096c0a125a8a3b7"} Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.548992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.550151 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" event={"ID":"c2182e6f-c24c-4164-a269-4c11d34057a7","Type":"ContainerStarted","Data":"191191774a3607e29c386e77cce9f779a676b2c0c614527ff3b94a12c82fd0d0"} Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.550347 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.551689 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" event={"ID":"a086b7d2-5401-4754-9825-2425a3a2aa22","Type":"ContainerStarted","Data":"caa84c284ecdfeb9a25e4ebce0264e05895f2f71fc030975eeef2fea3776e5b2"} Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.551889 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.553277 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" event={"ID":"922d6301-937e-403a-ade6-06620798c61c","Type":"ContainerStarted","Data":"072eb4428914152115efb97d765bb4e1bde67049c6a0eb8c3da9ca464545e92d"} Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.553467 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.554670 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" event={"ID":"e4f5b829-46e0-4048-9b51-1a9256375d4f","Type":"ContainerStarted","Data":"5b7b5c9c7c32dc6fd3d9a9e4a172a616189644fd5591a3ecdec00da8dca9e6c1"} Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.554833 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.566061 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs" podStartSLOduration=2.9099844470000003 podStartE2EDuration="24.56604153s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:29.122538537 +0000 UTC m=+807.039538350" lastFinishedPulling="2025-10-14 13:14:50.77859562 +0000 UTC m=+828.695595433" observedRunningTime="2025-10-14 13:14:51.562042901 +0000 UTC m=+829.479042714" watchObservedRunningTime="2025-10-14 13:14:51.56604153 +0000 UTC m=+829.483041343" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.585089 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" podStartSLOduration=2.772729563 podStartE2EDuration="24.585068533s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.969902759 +0000 UTC m=+806.886902572" lastFinishedPulling="2025-10-14 13:14:50.782241729 +0000 UTC m=+828.699241542" observedRunningTime="2025-10-14 13:14:51.574074886 +0000 UTC m=+829.491074719" watchObservedRunningTime="2025-10-14 13:14:51.585068533 +0000 UTC m=+829.502068346" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.593643 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" podStartSLOduration=2.750880731 podStartE2EDuration="24.593625523s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.936129546 +0000 UTC m=+806.853129359" lastFinishedPulling="2025-10-14 13:14:50.778874338 +0000 UTC m=+828.695874151" observedRunningTime="2025-10-14 13:14:51.591485436 +0000 UTC m=+829.508485249" watchObservedRunningTime="2025-10-14 13:14:51.593625523 +0000 UTC m=+829.510625336" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.610891 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" podStartSLOduration=3.830326558 podStartE2EDuration="25.61087542s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:28.999346063 +0000 UTC m=+806.916345876" lastFinishedPulling="2025-10-14 13:14:50.779894915 +0000 UTC m=+828.696894738" observedRunningTime="2025-10-14 13:14:51.607752795 +0000 UTC m=+829.524752608" watchObservedRunningTime="2025-10-14 13:14:51.61087542 +0000 UTC m=+829.527875233" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.623053 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" podStartSLOduration=3.845883046 podStartE2EDuration="25.623038828s" podCreationTimestamp="2025-10-14 13:14:26 +0000 UTC" firstStartedPulling="2025-10-14 13:14:29.002721233 +0000 UTC m=+806.919721036" lastFinishedPulling="2025-10-14 13:14:50.779877005 +0000 UTC m=+828.696876818" observedRunningTime="2025-10-14 13:14:51.620838438 +0000 UTC m=+829.537838251" watchObservedRunningTime="2025-10-14 13:14:51.623038828 +0000 UTC m=+829.540038641" Oct 14 13:14:51 crc kubenswrapper[4837]: I1014 13:14:51.646992 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" podStartSLOduration=2.995949905 podStartE2EDuration="24.646975703s" podCreationTimestamp="2025-10-14 13:14:27 +0000 UTC" firstStartedPulling="2025-10-14 13:14:29.16268379 +0000 UTC m=+807.079683603" lastFinishedPulling="2025-10-14 13:14:50.813709588 +0000 UTC m=+828.730709401" observedRunningTime="2025-10-14 13:14:51.645548625 +0000 UTC m=+829.562548438" watchObservedRunningTime="2025-10-14 13:14:51.646975703 +0000 UTC m=+829.563975516" Oct 14 13:14:57 crc kubenswrapper[4837]: I1014 13:14:57.406453 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-mdzbv" Oct 14 13:14:57 crc kubenswrapper[4837]: I1014 13:14:57.497996 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-9hctf" Oct 14 13:14:57 crc kubenswrapper[4837]: I1014 13:14:57.614286 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-jkjjd" Oct 14 13:14:57 crc kubenswrapper[4837]: I1014 13:14:57.663317 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-mhs2q" Oct 14 13:14:58 crc kubenswrapper[4837]: I1014 13:14:58.141023 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.138690 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln"] Oct 14 13:15:00 crc kubenswrapper[4837]: E1014 13:15:00.139230 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="extract-content" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.139243 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="extract-content" Oct 14 13:15:00 crc kubenswrapper[4837]: E1014 13:15:00.139290 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="extract-utilities" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.139298 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="extract-utilities" Oct 14 13:15:00 crc kubenswrapper[4837]: E1014 13:15:00.139323 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="registry-server" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.139329 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="registry-server" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.139448 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="33af3f9b-bfd9-40e3-bd61-e63c24cfa89d" containerName="registry-server" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 
13:15:00.139888 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.145649 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.145871 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.149572 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln"] Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.215376 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5c1cd8b-17e5-454d-8925-f75cd0539c12-secret-volume\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.215496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmctz\" (UniqueName: \"kubernetes.io/projected/e5c1cd8b-17e5-454d-8925-f75cd0539c12-kube-api-access-nmctz\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.215549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5c1cd8b-17e5-454d-8925-f75cd0539c12-config-volume\") pod \"collect-profiles-29340795-9cnln\" (UID: 
\"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.317375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5c1cd8b-17e5-454d-8925-f75cd0539c12-secret-volume\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.317477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmctz\" (UniqueName: \"kubernetes.io/projected/e5c1cd8b-17e5-454d-8925-f75cd0539c12-kube-api-access-nmctz\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.317523 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5c1cd8b-17e5-454d-8925-f75cd0539c12-config-volume\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.318634 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5c1cd8b-17e5-454d-8925-f75cd0539c12-config-volume\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.324543 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e5c1cd8b-17e5-454d-8925-f75cd0539c12-secret-volume\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.337677 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmctz\" (UniqueName: \"kubernetes.io/projected/e5c1cd8b-17e5-454d-8925-f75cd0539c12-kube-api-access-nmctz\") pod \"collect-profiles-29340795-9cnln\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.459848 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:00 crc kubenswrapper[4837]: I1014 13:15:00.905108 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln"] Oct 14 13:15:01 crc kubenswrapper[4837]: I1014 13:15:01.625540 4837 generic.go:334] "Generic (PLEG): container finished" podID="e5c1cd8b-17e5-454d-8925-f75cd0539c12" containerID="89b8d4715045f5249b4eb01bd2eedd2847326fe345ca530a88bf6de970df01cd" exitCode=0 Oct 14 13:15:01 crc kubenswrapper[4837]: I1014 13:15:01.625636 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" event={"ID":"e5c1cd8b-17e5-454d-8925-f75cd0539c12","Type":"ContainerDied","Data":"89b8d4715045f5249b4eb01bd2eedd2847326fe345ca530a88bf6de970df01cd"} Oct 14 13:15:01 crc kubenswrapper[4837]: I1014 13:15:01.625815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" 
event={"ID":"e5c1cd8b-17e5-454d-8925-f75cd0539c12","Type":"ContainerStarted","Data":"c13070bae3e26dd74a91dab85fc1d7978a07fb2797a7c4e402633d83611d6edf"} Oct 14 13:15:02 crc kubenswrapper[4837]: I1014 13:15:02.997586 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.061232 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmctz\" (UniqueName: \"kubernetes.io/projected/e5c1cd8b-17e5-454d-8925-f75cd0539c12-kube-api-access-nmctz\") pod \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.061297 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5c1cd8b-17e5-454d-8925-f75cd0539c12-secret-volume\") pod \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.061317 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5c1cd8b-17e5-454d-8925-f75cd0539c12-config-volume\") pod \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\" (UID: \"e5c1cd8b-17e5-454d-8925-f75cd0539c12\") " Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.061932 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c1cd8b-17e5-454d-8925-f75cd0539c12-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5c1cd8b-17e5-454d-8925-f75cd0539c12" (UID: "e5c1cd8b-17e5-454d-8925-f75cd0539c12"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.066064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c1cd8b-17e5-454d-8925-f75cd0539c12-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5c1cd8b-17e5-454d-8925-f75cd0539c12" (UID: "e5c1cd8b-17e5-454d-8925-f75cd0539c12"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.066558 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c1cd8b-17e5-454d-8925-f75cd0539c12-kube-api-access-nmctz" (OuterVolumeSpecName: "kube-api-access-nmctz") pod "e5c1cd8b-17e5-454d-8925-f75cd0539c12" (UID: "e5c1cd8b-17e5-454d-8925-f75cd0539c12"). InnerVolumeSpecName "kube-api-access-nmctz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.162671 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmctz\" (UniqueName: \"kubernetes.io/projected/e5c1cd8b-17e5-454d-8925-f75cd0539c12-kube-api-access-nmctz\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.162736 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5c1cd8b-17e5-454d-8925-f75cd0539c12-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.162765 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5c1cd8b-17e5-454d-8925-f75cd0539c12-config-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.656395 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" 
event={"ID":"e5c1cd8b-17e5-454d-8925-f75cd0539c12","Type":"ContainerDied","Data":"c13070bae3e26dd74a91dab85fc1d7978a07fb2797a7c4e402633d83611d6edf"} Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.656791 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c13070bae3e26dd74a91dab85fc1d7978a07fb2797a7c4e402633d83611d6edf" Oct 14 13:15:03 crc kubenswrapper[4837]: I1014 13:15:03.656445 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.923540 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7x5lf"] Oct 14 13:15:13 crc kubenswrapper[4837]: E1014 13:15:13.924702 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c1cd8b-17e5-454d-8925-f75cd0539c12" containerName="collect-profiles" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.924721 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c1cd8b-17e5-454d-8925-f75cd0539c12" containerName="collect-profiles" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.924918 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c1cd8b-17e5-454d-8925-f75cd0539c12" containerName="collect-profiles" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.925861 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.929083 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.929731 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7x5lf"] Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.934019 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.934135 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 14 13:15:13 crc kubenswrapper[4837]: I1014 13:15:13.934417 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8xrss" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.004172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94c8\" (UniqueName: \"kubernetes.io/projected/f852e053-d91f-485f-8b38-7545a1618c5e-kube-api-access-n94c8\") pod \"dnsmasq-dns-675f4bcbfc-7x5lf\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.004251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f852e053-d91f-485f-8b38-7545a1618c5e-config\") pod \"dnsmasq-dns-675f4bcbfc-7x5lf\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.022403 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tnmsl"] Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.024046 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.025669 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.039900 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tnmsl"] Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.105692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n94c8\" (UniqueName: \"kubernetes.io/projected/f852e053-d91f-485f-8b38-7545a1618c5e-kube-api-access-n94c8\") pod \"dnsmasq-dns-675f4bcbfc-7x5lf\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.105818 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f852e053-d91f-485f-8b38-7545a1618c5e-config\") pod \"dnsmasq-dns-675f4bcbfc-7x5lf\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.107243 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f852e053-d91f-485f-8b38-7545a1618c5e-config\") pod \"dnsmasq-dns-675f4bcbfc-7x5lf\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.127220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94c8\" (UniqueName: \"kubernetes.io/projected/f852e053-d91f-485f-8b38-7545a1618c5e-kube-api-access-n94c8\") pod \"dnsmasq-dns-675f4bcbfc-7x5lf\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.207342 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-config\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.207737 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.207764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gmw\" (UniqueName: \"kubernetes.io/projected/37b086f0-09e1-4f81-a31e-032c4c418040-kube-api-access-94gmw\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.250701 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.309222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-config\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.309566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.309650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94gmw\" (UniqueName: \"kubernetes.io/projected/37b086f0-09e1-4f81-a31e-032c4c418040-kube-api-access-94gmw\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.310089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-config\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.310311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 
13:15:14.349477 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94gmw\" (UniqueName: \"kubernetes.io/projected/37b086f0-09e1-4f81-a31e-032c4c418040-kube-api-access-94gmw\") pod \"dnsmasq-dns-78dd6ddcc-tnmsl\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.641570 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.685210 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7x5lf"] Oct 14 13:15:14 crc kubenswrapper[4837]: I1014 13:15:14.731994 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" event={"ID":"f852e053-d91f-485f-8b38-7545a1618c5e","Type":"ContainerStarted","Data":"1c5dc872915180ceb271981ff90f90faddaccadf3ce37c583ead60c9244b89cf"} Oct 14 13:15:15 crc kubenswrapper[4837]: I1014 13:15:15.076712 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tnmsl"] Oct 14 13:15:15 crc kubenswrapper[4837]: W1014 13:15:15.079252 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b086f0_09e1_4f81_a31e_032c4c418040.slice/crio-3e637db39dcc6db68c17d6c8baff5bbdd8c5a3025f24439fa6562165d719e15f WatchSource:0}: Error finding container 3e637db39dcc6db68c17d6c8baff5bbdd8c5a3025f24439fa6562165d719e15f: Status 404 returned error can't find the container with id 3e637db39dcc6db68c17d6c8baff5bbdd8c5a3025f24439fa6562165d719e15f Oct 14 13:15:15 crc kubenswrapper[4837]: I1014 13:15:15.745686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" 
event={"ID":"37b086f0-09e1-4f81-a31e-032c4c418040","Type":"ContainerStarted","Data":"3e637db39dcc6db68c17d6c8baff5bbdd8c5a3025f24439fa6562165d719e15f"} Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.100931 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7x5lf"] Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.128972 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-drzz4"] Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.130463 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.156808 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-drzz4"] Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.159754 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9m8k\" (UniqueName: \"kubernetes.io/projected/ee5dcc94-8303-441c-b231-7a619a26c732-kube-api-access-g9m8k\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.159823 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-config\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.159872 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.261212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9m8k\" (UniqueName: \"kubernetes.io/projected/ee5dcc94-8303-441c-b231-7a619a26c732-kube-api-access-g9m8k\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.261546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-config\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.261596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.262438 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.263550 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-config\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.295431 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9m8k\" (UniqueName: \"kubernetes.io/projected/ee5dcc94-8303-441c-b231-7a619a26c732-kube-api-access-g9m8k\") pod \"dnsmasq-dns-5ccc8479f9-drzz4\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.407191 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tnmsl"] Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.427073 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7v48"] Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.428244 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.454613 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7v48"] Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.463668 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-config\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.463795 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.463855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg7bt\" (UniqueName: 
\"kubernetes.io/projected/b029d6b5-4398-42bf-abb7-30bd70de9142-kube-api-access-sg7bt\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.548085 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.566830 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-config\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.566926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.566966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg7bt\" (UniqueName: \"kubernetes.io/projected/b029d6b5-4398-42bf-abb7-30bd70de9142-kube-api-access-sg7bt\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.568151 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-config\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.568660 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.598663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg7bt\" (UniqueName: \"kubernetes.io/projected/b029d6b5-4398-42bf-abb7-30bd70de9142-kube-api-access-sg7bt\") pod \"dnsmasq-dns-57d769cc4f-v7v48\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:17 crc kubenswrapper[4837]: I1014 13:15:17.756356 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.009324 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-drzz4"] Oct 14 13:15:18 crc kubenswrapper[4837]: W1014 13:15:18.016974 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5dcc94_8303_441c_b231_7a619a26c732.slice/crio-380fc70f745822768800351a2552560c61dafd7977668f4a5fd5255f901c799f WatchSource:0}: Error finding container 380fc70f745822768800351a2552560c61dafd7977668f4a5fd5255f901c799f: Status 404 returned error can't find the container with id 380fc70f745822768800351a2552560c61dafd7977668f4a5fd5255f901c799f Oct 14 13:15:18 crc kubenswrapper[4837]: W1014 13:15:18.193420 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb029d6b5_4398_42bf_abb7_30bd70de9142.slice/crio-8c9262825bf08b4e25a2d024e2d2d47facd80ce3a2992659a722bca2db235d50 WatchSource:0}: Error finding container 8c9262825bf08b4e25a2d024e2d2d47facd80ce3a2992659a722bca2db235d50: 
Status 404 returned error can't find the container with id 8c9262825bf08b4e25a2d024e2d2d47facd80ce3a2992659a722bca2db235d50 Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.194234 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7v48"] Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.294607 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.297019 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.298757 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.300651 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.300822 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.300947 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.301050 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.301188 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.301282 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-krn9k" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.303778 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.480959 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfnk\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-kube-api-access-nnfnk\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481005 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481060 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481141 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481244 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6877e694-37ca-4cd4-ba01-3101d4f7ade4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481302 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481330 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481402 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481460 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6877e694-37ca-4cd4-ba01-3101d4f7ade4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 
14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.481492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582761 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582808 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582873 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6877e694-37ca-4cd4-ba01-3101d4f7ade4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582892 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582905 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6877e694-37ca-4cd4-ba01-3101d4f7ade4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.582984 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.583029 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfnk\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-kube-api-access-nnfnk\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.583723 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.583931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.586987 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.587200 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.587663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.590239 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.591043 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6877e694-37ca-4cd4-ba01-3101d4f7ade4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.591875 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6877e694-37ca-4cd4-ba01-3101d4f7ade4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.601524 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc 
kubenswrapper[4837]: I1014 13:15:18.602232 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.607530 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.608917 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.611624 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfnk\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-kube-api-access-nnfnk\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.615870 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.616072 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.616669 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.616958 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.617044 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gstcl" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.617209 4837 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.617870 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.621861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.660270 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.778695 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" event={"ID":"b029d6b5-4398-42bf-abb7-30bd70de9142","Type":"ContainerStarted","Data":"8c9262825bf08b4e25a2d024e2d2d47facd80ce3a2992659a722bca2db235d50"} Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.779849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" event={"ID":"ee5dcc94-8303-441c-b231-7a619a26c732","Type":"ContainerStarted","Data":"380fc70f745822768800351a2552560c61dafd7977668f4a5fd5255f901c799f"} Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.785853 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4tq\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-kube-api-access-9s4tq\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.785896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.785918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0fcd80e-9aec-4608-bfc5-653c443d1849-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.785957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.785984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.786014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.786052 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e0fcd80e-9aec-4608-bfc5-653c443d1849-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.786074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.786099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.786121 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.786148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889841 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0fcd80e-9aec-4608-bfc5-653c443d1849-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889882 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889902 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.889977 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4tq\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-kube-api-access-9s4tq\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.890000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.890031 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0fcd80e-9aec-4608-bfc5-653c443d1849-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.890707 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.891502 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.891598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.892763 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.893683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.895043 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 
13:15:18.905217 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0fcd80e-9aec-4608-bfc5-653c443d1849-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.907398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.907840 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.916026 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0fcd80e-9aec-4608-bfc5-653c443d1849-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.917996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4tq\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-kube-api-access-9s4tq\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.942477 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:15:18 crc kubenswrapper[4837]: I1014 13:15:18.985427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " pod="openstack/rabbitmq-server-0" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.010149 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.926013 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.927252 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.931150 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mcxrb" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.931829 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.945363 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.945612 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.945748 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.945625 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 14 13:15:19 crc kubenswrapper[4837]: I1014 13:15:19.955409 4837 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.112852 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-kolla-config\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.112943 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-operator-scripts\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lsnj\" (UniqueName: \"kubernetes.io/projected/162a8777-0979-4087-959a-98cd20678758-kube-api-access-4lsnj\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113190 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/162a8777-0979-4087-959a-98cd20678758-config-data-generated\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " 
pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-config-data-default\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113334 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-secrets\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.113453 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: 
I1014 13:15:20.277782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-kolla-config\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277822 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-operator-scripts\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lsnj\" (UniqueName: \"kubernetes.io/projected/162a8777-0979-4087-959a-98cd20678758-kube-api-access-4lsnj\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/162a8777-0979-4087-959a-98cd20678758-config-data-generated\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.277997 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-config-data-default\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.278023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-secrets\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.279426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/162a8777-0979-4087-959a-98cd20678758-config-data-generated\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.280688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-config-data-default\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.280736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-kolla-config\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " 
pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.280814 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.283689 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/162a8777-0979-4087-959a-98cd20678758-operator-scripts\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.283730 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.286325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.288819 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/162a8777-0979-4087-959a-98cd20678758-secrets\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.328380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.343504 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lsnj\" (UniqueName: \"kubernetes.io/projected/162a8777-0979-4087-959a-98cd20678758-kube-api-access-4lsnj\") pod \"openstack-galera-0\" (UID: \"162a8777-0979-4087-959a-98cd20678758\") " pod="openstack/openstack-galera-0" Oct 14 13:15:20 crc kubenswrapper[4837]: I1014 13:15:20.554806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.333325 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.336756 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.339758 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-z7tqq" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.339921 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.340273 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.340430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.348010 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495088 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495239 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495287 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495347 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhm6r\" (UniqueName: \"kubernetes.io/projected/11ead61d-f315-4ee0-9dcb-a222012a9c36-kube-api-access-jhm6r\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11ead61d-f315-4ee0-9dcb-a222012a9c36-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495467 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.495493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.533968 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.535242 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.538063 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xd7mw" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.538301 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.538811 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.554119 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.596733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.596854 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.597029 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.597701 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.597761 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.598225 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.598264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.598298 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhm6r\" (UniqueName: \"kubernetes.io/projected/11ead61d-f315-4ee0-9dcb-a222012a9c36-kube-api-access-jhm6r\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.598353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11ead61d-f315-4ee0-9dcb-a222012a9c36-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.598389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.598411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.599646 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.601964 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.602289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11ead61d-f315-4ee0-9dcb-a222012a9c36-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.602417 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11ead61d-f315-4ee0-9dcb-a222012a9c36-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.607031 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.609584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11ead61d-f315-4ee0-9dcb-a222012a9c36-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.625857 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhm6r\" (UniqueName: \"kubernetes.io/projected/11ead61d-f315-4ee0-9dcb-a222012a9c36-kube-api-access-jhm6r\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.633322 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"11ead61d-f315-4ee0-9dcb-a222012a9c36\") " pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.701049 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0454524b-c83b-4049-ad05-8b29a317bc91-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.701212 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0454524b-c83b-4049-ad05-8b29a317bc91-config-data\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.701289 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454524b-c83b-4049-ad05-8b29a317bc91-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 
13:15:21.701408 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0454524b-c83b-4049-ad05-8b29a317bc91-kolla-config\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.701470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnrw\" (UniqueName: \"kubernetes.io/projected/0454524b-c83b-4049-ad05-8b29a317bc91-kube-api-access-ffnrw\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.707816 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.802495 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0454524b-c83b-4049-ad05-8b29a317bc91-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.802578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0454524b-c83b-4049-ad05-8b29a317bc91-config-data\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.802608 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454524b-c83b-4049-ad05-8b29a317bc91-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc 
kubenswrapper[4837]: I1014 13:15:21.802660 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0454524b-c83b-4049-ad05-8b29a317bc91-kolla-config\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.802707 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnrw\" (UniqueName: \"kubernetes.io/projected/0454524b-c83b-4049-ad05-8b29a317bc91-kube-api-access-ffnrw\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.804431 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0454524b-c83b-4049-ad05-8b29a317bc91-kolla-config\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.804704 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0454524b-c83b-4049-ad05-8b29a317bc91-config-data\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.808303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0454524b-c83b-4049-ad05-8b29a317bc91-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.818631 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454524b-c83b-4049-ad05-8b29a317bc91-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.821262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnrw\" (UniqueName: \"kubernetes.io/projected/0454524b-c83b-4049-ad05-8b29a317bc91-kube-api-access-ffnrw\") pod \"memcached-0\" (UID: \"0454524b-c83b-4049-ad05-8b29a317bc91\") " pod="openstack/memcached-0" Oct 14 13:15:21 crc kubenswrapper[4837]: I1014 13:15:21.851881 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.415490 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.416502 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.420330 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vz2m7" Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.435118 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.532245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jt9z\" (UniqueName: \"kubernetes.io/projected/561b6993-3913-4d1a-89d8-c146f5e2bd6a-kube-api-access-9jt9z\") pod \"kube-state-metrics-0\" (UID: \"561b6993-3913-4d1a-89d8-c146f5e2bd6a\") " pod="openstack/kube-state-metrics-0" Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.633961 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jt9z\" (UniqueName: \"kubernetes.io/projected/561b6993-3913-4d1a-89d8-c146f5e2bd6a-kube-api-access-9jt9z\") pod \"kube-state-metrics-0\" (UID: 
\"561b6993-3913-4d1a-89d8-c146f5e2bd6a\") " pod="openstack/kube-state-metrics-0" Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.654832 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jt9z\" (UniqueName: \"kubernetes.io/projected/561b6993-3913-4d1a-89d8-c146f5e2bd6a-kube-api-access-9jt9z\") pod \"kube-state-metrics-0\" (UID: \"561b6993-3913-4d1a-89d8-c146f5e2bd6a\") " pod="openstack/kube-state-metrics-0" Oct 14 13:15:23 crc kubenswrapper[4837]: I1014 13:15:23.734757 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.417368 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j4tpc"] Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.418710 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.423527 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.424310 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.424339 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j4tpc"] Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.424978 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-q82jh" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.485917 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cp8xg"] Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.487752 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.492873 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cp8xg"] Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.590604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-etc-ovs\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.590989 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-scripts\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.591146 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-log\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.591365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88df\" (UniqueName: \"kubernetes.io/projected/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-kube-api-access-b88df\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.591493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-run\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.591610 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-lib\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.591751 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-run\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.591850 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f970e0-8d42-46d6-937a-c39f521f6bea-scripts\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.592029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f970e0-8d42-46d6-937a-c39f521f6bea-ovn-controller-tls-certs\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.592272 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvtk2\" (UniqueName: 
\"kubernetes.io/projected/14f970e0-8d42-46d6-937a-c39f521f6bea-kube-api-access-mvtk2\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.592495 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-log-ovn\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.592658 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-run-ovn\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.592791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f970e0-8d42-46d6-937a-c39f521f6bea-combined-ca-bundle\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.662725 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.664021 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.666490 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.671308 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.671603 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.671810 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kwkt6" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.671615 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.687246 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.705470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-etc-ovs\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.705765 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-scripts\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.705880 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-log\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.706515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-log\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.707295 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88df\" (UniqueName: \"kubernetes.io/projected/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-kube-api-access-b88df\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.707425 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-run\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.707548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-lib\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.707708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-run\") pod \"ovn-controller-j4tpc\" (UID: 
\"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.710889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f970e0-8d42-46d6-937a-c39f521f6bea-scripts\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.711058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f970e0-8d42-46d6-937a-c39f521f6bea-ovn-controller-tls-certs\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.711676 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvtk2\" (UniqueName: \"kubernetes.io/projected/14f970e0-8d42-46d6-937a-c39f521f6bea-kube-api-access-mvtk2\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.711874 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-log-ovn\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.712013 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-run-ovn\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 
13:15:27.712104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f970e0-8d42-46d6-937a-c39f521f6bea-combined-ca-bundle\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.717022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-run\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.717100 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-run\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.717334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-etc-ovs\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.717353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-var-lib\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.717496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-run-ovn\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.717938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14f970e0-8d42-46d6-937a-c39f521f6bea-var-log-ovn\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.718983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14f970e0-8d42-46d6-937a-c39f521f6bea-combined-ca-bundle\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.719504 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14f970e0-8d42-46d6-937a-c39f521f6bea-scripts\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.720530 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-scripts\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.732375 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/14f970e0-8d42-46d6-937a-c39f521f6bea-ovn-controller-tls-certs\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " 
pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.742972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88df\" (UniqueName: \"kubernetes.io/projected/802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6-kube-api-access-b88df\") pod \"ovn-controller-ovs-cp8xg\" (UID: \"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6\") " pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.745523 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvtk2\" (UniqueName: \"kubernetes.io/projected/14f970e0-8d42-46d6-937a-c39f521f6bea-kube-api-access-mvtk2\") pod \"ovn-controller-j4tpc\" (UID: \"14f970e0-8d42-46d6-937a-c39f521f6bea\") " pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.762557 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.806118 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc544b19-4b52-46ca-9c0b-518f78ebb47b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813070 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cc544b19-4b52-46ca-9c0b-518f78ebb47b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813236 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc 
kubenswrapper[4837]: I1014 13:15:27.813593 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813670 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v724\" (UniqueName: \"kubernetes.io/projected/cc544b19-4b52-46ca-9c0b-518f78ebb47b-kube-api-access-5v724\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.813699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc544b19-4b52-46ca-9c0b-518f78ebb47b-config\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc544b19-4b52-46ca-9c0b-518f78ebb47b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cc544b19-4b52-46ca-9c0b-518f78ebb47b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914857 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914886 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v724\" (UniqueName: \"kubernetes.io/projected/cc544b19-4b52-46ca-9c0b-518f78ebb47b-kube-api-access-5v724\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.914901 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc544b19-4b52-46ca-9c0b-518f78ebb47b-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.915658 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc544b19-4b52-46ca-9c0b-518f78ebb47b-config\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.917018 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc544b19-4b52-46ca-9c0b-518f78ebb47b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.917075 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.917149 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cc544b19-4b52-46ca-9c0b-518f78ebb47b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.921411 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.921538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.930577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc544b19-4b52-46ca-9c0b-518f78ebb47b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.939170 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v724\" (UniqueName: \"kubernetes.io/projected/cc544b19-4b52-46ca-9c0b-518f78ebb47b-kube-api-access-5v724\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.953533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cc544b19-4b52-46ca-9c0b-518f78ebb47b\") " pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:27 crc kubenswrapper[4837]: I1014 13:15:27.988533 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.449745 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.453386 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.456245 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.456474 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zqdmr" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.456666 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.457098 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.465184 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f5a204e-7b4c-41c2-8d69-e93d3c986249-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556242 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f5a204e-7b4c-41c2-8d69-e93d3c986249-config\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556329 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " 
pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556398 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzt9\" (UniqueName: \"kubernetes.io/projected/7f5a204e-7b4c-41c2-8d69-e93d3c986249-kube-api-access-rpzt9\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556438 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f5a204e-7b4c-41c2-8d69-e93d3c986249-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556546 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556648 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.556681 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 
13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.660842 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f5a204e-7b4c-41c2-8d69-e93d3c986249-config\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.660916 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.660960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzt9\" (UniqueName: \"kubernetes.io/projected/7f5a204e-7b4c-41c2-8d69-e93d3c986249-kube-api-access-rpzt9\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.660993 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f5a204e-7b4c-41c2-8d69-e93d3c986249-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.661052 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.661122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.661181 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.661257 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f5a204e-7b4c-41c2-8d69-e93d3c986249-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.661883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f5a204e-7b4c-41c2-8d69-e93d3c986249-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.662398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f5a204e-7b4c-41c2-8d69-e93d3c986249-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.662931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f5a204e-7b4c-41c2-8d69-e93d3c986249-config\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc 
kubenswrapper[4837]: I1014 13:15:30.663057 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.670778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.675065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.678312 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5a204e-7b4c-41c2-8d69-e93d3c986249-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.683115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.692020 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzt9\" (UniqueName: 
\"kubernetes.io/projected/7f5a204e-7b4c-41c2-8d69-e93d3c986249-kube-api-access-rpzt9\") pod \"ovsdbserver-sb-0\" (UID: \"7f5a204e-7b4c-41c2-8d69-e93d3c986249\") " pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:30 crc kubenswrapper[4837]: I1014 13:15:30.804232 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:34 crc kubenswrapper[4837]: I1014 13:15:34.289829 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.069062 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod162a8777_0979_4087_959a_98cd20678758.slice/crio-f610474c0228fc125f1604d0e6880a70d6503d6caab4b37c79ada35bb3b96975 WatchSource:0}: Error finding container f610474c0228fc125f1604d0e6880a70d6503d6caab4b37c79ada35bb3b96975: Status 404 returned error can't find the container with id f610474c0228fc125f1604d0e6880a70d6503d6caab4b37c79ada35bb3b96975 Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.072718 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:15:35 crc kubenswrapper[4837]: E1014 13:15:35.217781 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 14 13:15:35 crc kubenswrapper[4837]: E1014 13:15:35.218153 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv 
--bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n94c8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-7x5lf_openstack(f852e053-d91f-485f-8b38-7545a1618c5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:15:35 crc kubenswrapper[4837]: E1014 13:15:35.219340 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" podUID="f852e053-d91f-485f-8b38-7545a1618c5e" Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.518977 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.532761 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: E1014 13:15:35.585942 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 14 13:15:35 crc kubenswrapper[4837]: E1014 13:15:35.586405 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-94gmw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-tnmsl_openstack(37b086f0-09e1-4f81-a31e-032c4c418040): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:15:35 crc kubenswrapper[4837]: E1014 13:15:35.588192 4837 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" podUID="37b086f0-09e1-4f81-a31e-032c4c418040" Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.674374 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fcd80e_9aec_4608_bfc5_653c443d1849.slice/crio-a95f1d8e90564fe92217504f036261379d63cf6d56ceb6f89d2522a4543a4d63 WatchSource:0}: Error finding container a95f1d8e90564fe92217504f036261379d63cf6d56ceb6f89d2522a4543a4d63: Status 404 returned error can't find the container with id a95f1d8e90564fe92217504f036261379d63cf6d56ceb6f89d2522a4543a4d63 Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.678738 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6877e694_37ca_4cd4_ba01_3101d4f7ade4.slice/crio-22f59a257e6fa9c7ee430af21708087e95418104397f897cabe56d52fa9fcdc7 WatchSource:0}: Error finding container 22f59a257e6fa9c7ee430af21708087e95418104397f897cabe56d52fa9fcdc7: Status 404 returned error can't find the container with id 22f59a257e6fa9c7ee430af21708087e95418104397f897cabe56d52fa9fcdc7 Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.705257 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.730441 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.735503 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod561b6993_3913_4d1a_89d8_c146f5e2bd6a.slice/crio-09ef5f60d18414704094a910cf86c075c41a8321829f4583de40095b9650c39f WatchSource:0}: Error finding container 
09ef5f60d18414704094a910cf86c075c41a8321829f4583de40095b9650c39f: Status 404 returned error can't find the container with id 09ef5f60d18414704094a910cf86c075c41a8321829f4583de40095b9650c39f Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.818102 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.833504 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5a204e_7b4c_41c2_8d69_e93d3c986249.slice/crio-a60bad93d64ce8c323940d7aedd8aa6066421d00abc8fd4860d5fd59183ca4e7 WatchSource:0}: Error finding container a60bad93d64ce8c323940d7aedd8aa6066421d00abc8fd4860d5fd59183ca4e7: Status 404 returned error can't find the container with id a60bad93d64ce8c323940d7aedd8aa6066421d00abc8fd4860d5fd59183ca4e7 Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.870499 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.878015 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14f970e0_8d42_46d6_937a_c39f521f6bea.slice/crio-a5d0ac203d2c5092f64f76ff8a66c5345ef96040d3b88489ddfb815cc3f905f2 WatchSource:0}: Error finding container a5d0ac203d2c5092f64f76ff8a66c5345ef96040d3b88489ddfb815cc3f905f2: Status 404 returned error can't find the container with id a5d0ac203d2c5092f64f76ff8a66c5345ef96040d3b88489ddfb815cc3f905f2 Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.883874 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j4tpc"] Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.891830 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0454524b_c83b_4049_ad05_8b29a317bc91.slice/crio-405f3e7e7d95e71aa2a519b85d2355a0bffa3ae6fa00512c36362bb35323d767 WatchSource:0}: Error finding container 405f3e7e7d95e71aa2a519b85d2355a0bffa3ae6fa00512c36362bb35323d767: Status 404 returned error can't find the container with id 405f3e7e7d95e71aa2a519b85d2355a0bffa3ae6fa00512c36362bb35323d767 Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.930053 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.935074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"162a8777-0979-4087-959a-98cd20678758","Type":"ContainerStarted","Data":"f610474c0228fc125f1604d0e6880a70d6503d6caab4b37c79ada35bb3b96975"} Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.936158 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0454524b-c83b-4049-ad05-8b29a317bc91","Type":"ContainerStarted","Data":"405f3e7e7d95e71aa2a519b85d2355a0bffa3ae6fa00512c36362bb35323d767"} Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.938121 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"561b6993-3913-4d1a-89d8-c146f5e2bd6a","Type":"ContainerStarted","Data":"09ef5f60d18414704094a910cf86c075c41a8321829f4583de40095b9650c39f"} Oct 14 13:15:35 crc kubenswrapper[4837]: W1014 13:15:35.938504 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc544b19_4b52_46ca_9c0b_518f78ebb47b.slice/crio-84819ce60ebb3e9fb20591928dbd55314f40c8da5087f46330490e97da8cdbba WatchSource:0}: Error finding container 84819ce60ebb3e9fb20591928dbd55314f40c8da5087f46330490e97da8cdbba: Status 404 returned error can't find the container with id 
84819ce60ebb3e9fb20591928dbd55314f40c8da5087f46330490e97da8cdbba Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.939575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0fcd80e-9aec-4608-bfc5-653c443d1849","Type":"ContainerStarted","Data":"a95f1d8e90564fe92217504f036261379d63cf6d56ceb6f89d2522a4543a4d63"} Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.941607 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7f5a204e-7b4c-41c2-8d69-e93d3c986249","Type":"ContainerStarted","Data":"a60bad93d64ce8c323940d7aedd8aa6066421d00abc8fd4860d5fd59183ca4e7"} Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.945073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j4tpc" event={"ID":"14f970e0-8d42-46d6-937a-c39f521f6bea","Type":"ContainerStarted","Data":"a5d0ac203d2c5092f64f76ff8a66c5345ef96040d3b88489ddfb815cc3f905f2"} Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.946486 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"11ead61d-f315-4ee0-9dcb-a222012a9c36","Type":"ContainerStarted","Data":"e0ff2612dad0e914b33b56e0cdaa94626f5b20a6bf4326721c38af4c88296af2"} Oct 14 13:15:35 crc kubenswrapper[4837]: I1014 13:15:35.948702 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6877e694-37ca-4cd4-ba01-3101d4f7ade4","Type":"ContainerStarted","Data":"22f59a257e6fa9c7ee430af21708087e95418104397f897cabe56d52fa9fcdc7"} Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.359377 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.360794 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.471107 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n94c8\" (UniqueName: \"kubernetes.io/projected/f852e053-d91f-485f-8b38-7545a1618c5e-kube-api-access-n94c8\") pod \"f852e053-d91f-485f-8b38-7545a1618c5e\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.471190 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-config\") pod \"37b086f0-09e1-4f81-a31e-032c4c418040\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.471305 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f852e053-d91f-485f-8b38-7545a1618c5e-config\") pod \"f852e053-d91f-485f-8b38-7545a1618c5e\" (UID: \"f852e053-d91f-485f-8b38-7545a1618c5e\") " Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.471326 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-dns-svc\") pod \"37b086f0-09e1-4f81-a31e-032c4c418040\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.471396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94gmw\" (UniqueName: \"kubernetes.io/projected/37b086f0-09e1-4f81-a31e-032c4c418040-kube-api-access-94gmw\") pod \"37b086f0-09e1-4f81-a31e-032c4c418040\" (UID: \"37b086f0-09e1-4f81-a31e-032c4c418040\") " Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.474009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f852e053-d91f-485f-8b38-7545a1618c5e-config" (OuterVolumeSpecName: "config") pod "f852e053-d91f-485f-8b38-7545a1618c5e" (UID: "f852e053-d91f-485f-8b38-7545a1618c5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.474019 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-config" (OuterVolumeSpecName: "config") pod "37b086f0-09e1-4f81-a31e-032c4c418040" (UID: "37b086f0-09e1-4f81-a31e-032c4c418040"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.474452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37b086f0-09e1-4f81-a31e-032c4c418040" (UID: "37b086f0-09e1-4f81-a31e-032c4c418040"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.476633 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b086f0-09e1-4f81-a31e-032c4c418040-kube-api-access-94gmw" (OuterVolumeSpecName: "kube-api-access-94gmw") pod "37b086f0-09e1-4f81-a31e-032c4c418040" (UID: "37b086f0-09e1-4f81-a31e-032c4c418040"). InnerVolumeSpecName "kube-api-access-94gmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.478368 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f852e053-d91f-485f-8b38-7545a1618c5e-kube-api-access-n94c8" (OuterVolumeSpecName: "kube-api-access-n94c8") pod "f852e053-d91f-485f-8b38-7545a1618c5e" (UID: "f852e053-d91f-485f-8b38-7545a1618c5e"). InnerVolumeSpecName "kube-api-access-n94c8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.575088 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f852e053-d91f-485f-8b38-7545a1618c5e-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.575129 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.575144 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94gmw\" (UniqueName: \"kubernetes.io/projected/37b086f0-09e1-4f81-a31e-032c4c418040-kube-api-access-94gmw\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.575160 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n94c8\" (UniqueName: \"kubernetes.io/projected/f852e053-d91f-485f-8b38-7545a1618c5e-kube-api-access-n94c8\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.575187 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b086f0-09e1-4f81-a31e-032c4c418040-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.621285 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cp8xg"] Oct 14 13:15:36 crc kubenswrapper[4837]: W1014 13:15:36.631940 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod802ad4c4_c2e5_4c95_88ef_950d8f1fbdf6.slice/crio-48be6d4b1c19ec001b88a8c9c6428431d50d4af1a89e526132f733f2cd47a478 WatchSource:0}: Error finding container 48be6d4b1c19ec001b88a8c9c6428431d50d4af1a89e526132f733f2cd47a478: Status 404 returned error can't find the 
container with id 48be6d4b1c19ec001b88a8c9c6428431d50d4af1a89e526132f733f2cd47a478 Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.957566 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" event={"ID":"f852e053-d91f-485f-8b38-7545a1618c5e","Type":"ContainerDied","Data":"1c5dc872915180ceb271981ff90f90faddaccadf3ce37c583ead60c9244b89cf"} Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.957630 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-7x5lf" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.960499 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cc544b19-4b52-46ca-9c0b-518f78ebb47b","Type":"ContainerStarted","Data":"84819ce60ebb3e9fb20591928dbd55314f40c8da5087f46330490e97da8cdbba"} Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.965000 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.965029 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tnmsl" event={"ID":"37b086f0-09e1-4f81-a31e-032c4c418040","Type":"ContainerDied","Data":"3e637db39dcc6db68c17d6c8baff5bbdd8c5a3025f24439fa6562165d719e15f"} Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.967250 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cp8xg" event={"ID":"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6","Type":"ContainerStarted","Data":"48be6d4b1c19ec001b88a8c9c6428431d50d4af1a89e526132f733f2cd47a478"} Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.970325 4837 generic.go:334] "Generic (PLEG): container finished" podID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerID="afd13dec87e53073329562d854e37abdf2286f524d777b60cc343cbf615417cd" exitCode=0 Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.970398 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" event={"ID":"b029d6b5-4398-42bf-abb7-30bd70de9142","Type":"ContainerDied","Data":"afd13dec87e53073329562d854e37abdf2286f524d777b60cc343cbf615417cd"} Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.973408 4837 generic.go:334] "Generic (PLEG): container finished" podID="ee5dcc94-8303-441c-b231-7a619a26c732" containerID="5df31765692767e92b3abbee0f9a592b4ee393dff582602ce21c25cf34fa0f7e" exitCode=0 Oct 14 13:15:36 crc kubenswrapper[4837]: I1014 13:15:36.973514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" event={"ID":"ee5dcc94-8303-441c-b231-7a619a26c732","Type":"ContainerDied","Data":"5df31765692767e92b3abbee0f9a592b4ee393dff582602ce21c25cf34fa0f7e"} Oct 14 13:15:37 crc kubenswrapper[4837]: I1014 13:15:37.017315 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7x5lf"] Oct 14 
13:15:37 crc kubenswrapper[4837]: I1014 13:15:37.017376 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-7x5lf"] Oct 14 13:15:37 crc kubenswrapper[4837]: I1014 13:15:37.041931 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tnmsl"] Oct 14 13:15:37 crc kubenswrapper[4837]: I1014 13:15:37.051573 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tnmsl"] Oct 14 13:15:38 crc kubenswrapper[4837]: I1014 13:15:38.794779 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b086f0-09e1-4f81-a31e-032c4c418040" path="/var/lib/kubelet/pods/37b086f0-09e1-4f81-a31e-032c4c418040/volumes" Oct 14 13:15:38 crc kubenswrapper[4837]: I1014 13:15:38.795254 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f852e053-d91f-485f-8b38-7545a1618c5e" path="/var/lib/kubelet/pods/f852e053-d91f-485f-8b38-7545a1618c5e/volumes" Oct 14 13:15:38 crc kubenswrapper[4837]: I1014 13:15:38.993810 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" event={"ID":"b029d6b5-4398-42bf-abb7-30bd70de9142","Type":"ContainerStarted","Data":"2767bf5c92185163626f471585ef7dfa566e833a02bb3f4ec4d09fe9787ae0c5"} Oct 14 13:15:38 crc kubenswrapper[4837]: I1014 13:15:38.993939 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:39 crc kubenswrapper[4837]: I1014 13:15:39.000622 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" event={"ID":"ee5dcc94-8303-441c-b231-7a619a26c732","Type":"ContainerStarted","Data":"70516738753a6cf8c36cdbd9ee5a0c4d8053d5b844f429016c94ec6098cb252b"} Oct 14 13:15:39 crc kubenswrapper[4837]: I1014 13:15:39.000757 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:39 crc 
kubenswrapper[4837]: I1014 13:15:39.018809 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" podStartSLOduration=4.421323613 podStartE2EDuration="22.018787797s" podCreationTimestamp="2025-10-14 13:15:17 +0000 UTC" firstStartedPulling="2025-10-14 13:15:18.202527892 +0000 UTC m=+856.119527705" lastFinishedPulling="2025-10-14 13:15:35.799992066 +0000 UTC m=+873.716991889" observedRunningTime="2025-10-14 13:15:39.016860064 +0000 UTC m=+876.933859877" watchObservedRunningTime="2025-10-14 13:15:39.018787797 +0000 UTC m=+876.935787620" Oct 14 13:15:39 crc kubenswrapper[4837]: I1014 13:15:39.038694 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" podStartSLOduration=4.344167094 podStartE2EDuration="22.038673931s" podCreationTimestamp="2025-10-14 13:15:17 +0000 UTC" firstStartedPulling="2025-10-14 13:15:18.019195359 +0000 UTC m=+855.936195172" lastFinishedPulling="2025-10-14 13:15:35.713702196 +0000 UTC m=+873.630702009" observedRunningTime="2025-10-14 13:15:39.035191795 +0000 UTC m=+876.952191628" watchObservedRunningTime="2025-10-14 13:15:39.038673931 +0000 UTC m=+876.955673754" Oct 14 13:15:41 crc kubenswrapper[4837]: I1014 13:15:41.140704 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:15:41 crc kubenswrapper[4837]: I1014 13:15:41.140773 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:15:42 crc 
kubenswrapper[4837]: I1014 13:15:42.549772 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:42 crc kubenswrapper[4837]: I1014 13:15:42.758135 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:15:42 crc kubenswrapper[4837]: I1014 13:15:42.813233 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-drzz4"] Oct 14 13:15:43 crc kubenswrapper[4837]: I1014 13:15:43.031776 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" containerName="dnsmasq-dns" containerID="cri-o://70516738753a6cf8c36cdbd9ee5a0c4d8053d5b844f429016c94ec6098cb252b" gracePeriod=10 Oct 14 13:15:44 crc kubenswrapper[4837]: I1014 13:15:44.038339 4837 generic.go:334] "Generic (PLEG): container finished" podID="ee5dcc94-8303-441c-b231-7a619a26c732" containerID="70516738753a6cf8c36cdbd9ee5a0c4d8053d5b844f429016c94ec6098cb252b" exitCode=0 Oct 14 13:15:44 crc kubenswrapper[4837]: I1014 13:15:44.038411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" event={"ID":"ee5dcc94-8303-441c-b231-7a619a26c732","Type":"ContainerDied","Data":"70516738753a6cf8c36cdbd9ee5a0c4d8053d5b844f429016c94ec6098cb252b"} Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.387510 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.549881 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-config\") pod \"ee5dcc94-8303-441c-b231-7a619a26c732\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.549952 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-dns-svc\") pod \"ee5dcc94-8303-441c-b231-7a619a26c732\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.550066 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9m8k\" (UniqueName: \"kubernetes.io/projected/ee5dcc94-8303-441c-b231-7a619a26c732-kube-api-access-g9m8k\") pod \"ee5dcc94-8303-441c-b231-7a619a26c732\" (UID: \"ee5dcc94-8303-441c-b231-7a619a26c732\") " Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.555859 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5dcc94-8303-441c-b231-7a619a26c732-kube-api-access-g9m8k" (OuterVolumeSpecName: "kube-api-access-g9m8k") pod "ee5dcc94-8303-441c-b231-7a619a26c732" (UID: "ee5dcc94-8303-441c-b231-7a619a26c732"). InnerVolumeSpecName "kube-api-access-g9m8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.589687 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-config" (OuterVolumeSpecName: "config") pod "ee5dcc94-8303-441c-b231-7a619a26c732" (UID: "ee5dcc94-8303-441c-b231-7a619a26c732"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.590694 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee5dcc94-8303-441c-b231-7a619a26c732" (UID: "ee5dcc94-8303-441c-b231-7a619a26c732"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.651681 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.651730 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9m8k\" (UniqueName: \"kubernetes.io/projected/ee5dcc94-8303-441c-b231-7a619a26c732-kube-api-access-g9m8k\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:45 crc kubenswrapper[4837]: I1014 13:15:45.651750 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5dcc94-8303-441c-b231-7a619a26c732-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.054120 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" event={"ID":"ee5dcc94-8303-441c-b231-7a619a26c732","Type":"ContainerDied","Data":"380fc70f745822768800351a2552560c61dafd7977668f4a5fd5255f901c799f"} Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.054199 4837 scope.go:117] "RemoveContainer" containerID="70516738753a6cf8c36cdbd9ee5a0c4d8053d5b844f429016c94ec6098cb252b" Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.054215 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-drzz4" Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.085926 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-drzz4"] Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.091508 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-drzz4"] Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.098423 4837 scope.go:117] "RemoveContainer" containerID="5df31765692767e92b3abbee0f9a592b4ee393dff582602ce21c25cf34fa0f7e" Oct 14 13:15:46 crc kubenswrapper[4837]: I1014 13:15:46.795993 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" path="/var/lib/kubelet/pods/ee5dcc94-8303-441c-b231-7a619a26c732/volumes" Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.066740 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"162a8777-0979-4087-959a-98cd20678758","Type":"ContainerStarted","Data":"4278fe50903b48166d5f52eaae8ee8e772e0185ef615b050b219509983a03c50"} Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.077858 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cc544b19-4b52-46ca-9c0b-518f78ebb47b","Type":"ContainerStarted","Data":"13ab31a4e35e2d83940ddb1f74edb74185cd40ed27fd60b0030d8e1a26317c14"} Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.078601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0454524b-c83b-4049-ad05-8b29a317bc91","Type":"ContainerStarted","Data":"1b1d5921eb14284caaa780892a07790fe57c02407f5ccf35b08ee43be8873696"} Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.079252 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.080770 4837 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7f5a204e-7b4c-41c2-8d69-e93d3c986249","Type":"ContainerStarted","Data":"cc178b2672540bb81eb1ebc876c3bed834ee59938445d4620971eed494a9df74"} Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.082478 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"11ead61d-f315-4ee0-9dcb-a222012a9c36","Type":"ContainerStarted","Data":"8ee58bf36785879f3f1fe8860c61c188d7e907b4375513d56402db155c92c9d0"} Oct 14 13:15:47 crc kubenswrapper[4837]: I1014 13:15:47.141625 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.52187788 podStartE2EDuration="26.141599578s" podCreationTimestamp="2025-10-14 13:15:21 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.901730707 +0000 UTC m=+873.818730520" lastFinishedPulling="2025-10-14 13:15:45.521452405 +0000 UTC m=+883.438452218" observedRunningTime="2025-10-14 13:15:47.138296948 +0000 UTC m=+885.055296771" watchObservedRunningTime="2025-10-14 13:15:47.141599578 +0000 UTC m=+885.058599391" Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.094627 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6877e694-37ca-4cd4-ba01-3101d4f7ade4","Type":"ContainerStarted","Data":"46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce"} Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.099366 4837 generic.go:334] "Generic (PLEG): container finished" podID="802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6" containerID="329f3d7c5dfdd068cde654d7ec338341a9cf109045c8d0f448ae2256afea7fdb" exitCode=0 Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.099492 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cp8xg" event={"ID":"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6","Type":"ContainerDied","Data":"329f3d7c5dfdd068cde654d7ec338341a9cf109045c8d0f448ae2256afea7fdb"} Oct 14 
13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.102109 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"561b6993-3913-4d1a-89d8-c146f5e2bd6a","Type":"ContainerStarted","Data":"172a03390e7002c502ec2b291ec110442330b698578d9984e74b70c1b89c66a8"} Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.102230 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.105020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0fcd80e-9aec-4608-bfc5-653c443d1849","Type":"ContainerStarted","Data":"3f5e3ffe8a2b7184a62f038431da9b3670de00f0a32b5022a406da34cfe945a7"} Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.107826 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j4tpc" event={"ID":"14f970e0-8d42-46d6-937a-c39f521f6bea","Type":"ContainerStarted","Data":"ddce01bb9ae043b459c127ea6eaafa0dd17c113b9c6442b639954f1a47a13ec8"} Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.108416 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-j4tpc" Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.143665 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.658099635 podStartE2EDuration="25.143646324s" podCreationTimestamp="2025-10-14 13:15:23 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.765596325 +0000 UTC m=+873.682596138" lastFinishedPulling="2025-10-14 13:15:46.251143014 +0000 UTC m=+884.168142827" observedRunningTime="2025-10-14 13:15:48.128114699 +0000 UTC m=+886.045114512" watchObservedRunningTime="2025-10-14 13:15:48.143646324 +0000 UTC m=+886.060646137" Oct 14 13:15:48 crc kubenswrapper[4837]: I1014 13:15:48.220558 4837 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-j4tpc" podStartSLOduration=11.511776793 podStartE2EDuration="21.220539726s" podCreationTimestamp="2025-10-14 13:15:27 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.880122806 +0000 UTC m=+873.797122619" lastFinishedPulling="2025-10-14 13:15:45.588885739 +0000 UTC m=+883.505885552" observedRunningTime="2025-10-14 13:15:48.216826605 +0000 UTC m=+886.133826428" watchObservedRunningTime="2025-10-14 13:15:48.220539726 +0000 UTC m=+886.137539539" Oct 14 13:15:49 crc kubenswrapper[4837]: I1014 13:15:49.118947 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cp8xg" event={"ID":"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6","Type":"ContainerStarted","Data":"b5105add8fa240e8397b46e9543c525b0d95f024dac9bf68c12de6b4155af7fc"} Oct 14 13:15:49 crc kubenswrapper[4837]: I1014 13:15:49.119342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cp8xg" event={"ID":"802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6","Type":"ContainerStarted","Data":"b303f7a32923d38a4259a0e49c0e6f81bd7789abe94f8799156164cc617fe864"} Oct 14 13:15:49 crc kubenswrapper[4837]: I1014 13:15:49.119643 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:49 crc kubenswrapper[4837]: I1014 13:15:49.146106 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cp8xg" podStartSLOduration=13.515945867 podStartE2EDuration="22.14608041s" podCreationTimestamp="2025-10-14 13:15:27 +0000 UTC" firstStartedPulling="2025-10-14 13:15:36.634730667 +0000 UTC m=+874.551730480" lastFinishedPulling="2025-10-14 13:15:45.26486522 +0000 UTC m=+883.181865023" observedRunningTime="2025-10-14 13:15:49.13658822 +0000 UTC m=+887.053588073" watchObservedRunningTime="2025-10-14 13:15:49.14608041 +0000 UTC m=+887.063080263" Oct 14 13:15:50 crc kubenswrapper[4837]: I1014 13:15:50.126267 4837 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.137958 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7f5a204e-7b4c-41c2-8d69-e93d3c986249","Type":"ContainerStarted","Data":"3555da87fbd09fa493f26e5d3f90f9842f5fca193413f9c03d474f7dbe8cc44c"} Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.140487 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cc544b19-4b52-46ca-9c0b-518f78ebb47b","Type":"ContainerStarted","Data":"b6df67b17e837de3a766a52fd66307d98c282b1eb4b9be4190d3a1f172bd2b06"} Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.170706 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.676165689 podStartE2EDuration="22.170682251s" podCreationTimestamp="2025-10-14 13:15:29 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.835912287 +0000 UTC m=+873.752912100" lastFinishedPulling="2025-10-14 13:15:50.330428849 +0000 UTC m=+888.247428662" observedRunningTime="2025-10-14 13:15:51.162751315 +0000 UTC m=+889.079751178" watchObservedRunningTime="2025-10-14 13:15:51.170682251 +0000 UTC m=+889.087682104" Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.189727 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.771515716 podStartE2EDuration="25.189707791s" podCreationTimestamp="2025-10-14 13:15:26 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.941575896 +0000 UTC m=+873.858575709" lastFinishedPulling="2025-10-14 13:15:50.359767971 +0000 UTC m=+888.276767784" observedRunningTime="2025-10-14 13:15:51.186109093 +0000 UTC m=+889.103108916" watchObservedRunningTime="2025-10-14 13:15:51.189707791 +0000 UTC m=+889.106707604" Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.805720 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.856420 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.864904 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:51 crc kubenswrapper[4837]: I1014 13:15:51.989786 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.027730 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.150801 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.150856 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.208005 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.209272 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.504393 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-rkfkw"] Oct 14 13:15:52 crc kubenswrapper[4837]: E1014 13:15:52.504782 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" containerName="dnsmasq-dns" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.504804 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" containerName="dnsmasq-dns" Oct 14 13:15:52 
crc kubenswrapper[4837]: E1014 13:15:52.504825 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" containerName="init" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.504833 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" containerName="init" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.505012 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5dcc94-8303-441c-b231-7a619a26c732" containerName="dnsmasq-dns" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.505968 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.511839 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.519049 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-rkfkw"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.670494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djktr\" (UniqueName: \"kubernetes.io/projected/5d21e444-c515-4894-9d45-4666a8a22b81-kube-api-access-djktr\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.670649 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.670705 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-config\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.671004 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.735695 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vj9cc"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.736887 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.738535 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.746402 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vj9cc"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.776049 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.776108 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-config\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.776196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.776250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djktr\" (UniqueName: \"kubernetes.io/projected/5d21e444-c515-4894-9d45-4666a8a22b81-kube-api-access-djktr\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.777284 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.777320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.777295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-config\") pod 
\"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.806097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djktr\" (UniqueName: \"kubernetes.io/projected/5d21e444-c515-4894-9d45-4666a8a22b81-kube-api-access-djktr\") pod \"dnsmasq-dns-6bc7876d45-rkfkw\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.831823 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.839435 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.856033 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.858615 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-swpbp" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.858812 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.858939 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.859054 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.864275 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-rkfkw"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.872468 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-northd-0"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.877122 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43da7026-edd3-4f7f-9944-1aff537446a0-config\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.877182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7h7w\" (UniqueName: \"kubernetes.io/projected/43da7026-edd3-4f7f-9944-1aff537446a0-kube-api-access-w7h7w\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.877241 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43da7026-edd3-4f7f-9944-1aff537446a0-ovn-rundir\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.877276 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43da7026-edd3-4f7f-9944-1aff537446a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.877310 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da7026-edd3-4f7f-9944-1aff537446a0-combined-ca-bundle\") pod \"ovn-controller-metrics-vj9cc\" (UID: 
\"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.877340 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43da7026-edd3-4f7f-9944-1aff537446a0-ovs-rundir\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.887696 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-zll5f"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.898939 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.903902 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.935797 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zll5f"] Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.982932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43da7026-edd3-4f7f-9944-1aff537446a0-config\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.982969 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7h7w\" (UniqueName: \"kubernetes.io/projected/43da7026-edd3-4f7f-9944-1aff537446a0-kube-api-access-w7h7w\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.982996 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f648709-678d-4844-8571-ac5c5c5712a3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983026 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f648709-678d-4844-8571-ac5c5c5712a3-scripts\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983055 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-config\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983071 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43da7026-edd3-4f7f-9944-1aff537446a0-ovn-rundir\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f648709-678d-4844-8571-ac5c5c5712a3-config\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983374 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-dns-svc\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983403 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcgn6\" (UniqueName: \"kubernetes.io/projected/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-kube-api-access-rcgn6\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983420 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983439 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/43da7026-edd3-4f7f-9944-1aff537446a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983471 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzw4\" (UniqueName: \"kubernetes.io/projected/3f648709-678d-4844-8571-ac5c5c5712a3-kube-api-access-fwzw4\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da7026-edd3-4f7f-9944-1aff537446a0-combined-ca-bundle\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983585 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43da7026-edd3-4f7f-9944-1aff537446a0-ovs-rundir\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/43da7026-edd3-4f7f-9944-1aff537446a0-ovn-rundir\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.983846 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/43da7026-edd3-4f7f-9944-1aff537446a0-ovs-rundir\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.984249 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43da7026-edd3-4f7f-9944-1aff537446a0-config\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.989478 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43da7026-edd3-4f7f-9944-1aff537446a0-combined-ca-bundle\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:52 crc kubenswrapper[4837]: I1014 13:15:52.990238 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/43da7026-edd3-4f7f-9944-1aff537446a0-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.005486 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7h7w\" (UniqueName: \"kubernetes.io/projected/43da7026-edd3-4f7f-9944-1aff537446a0-kube-api-access-w7h7w\") pod \"ovn-controller-metrics-vj9cc\" (UID: \"43da7026-edd3-4f7f-9944-1aff537446a0\") " pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.055876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vj9cc" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085565 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcgn6\" (UniqueName: \"kubernetes.io/projected/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-kube-api-access-rcgn6\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzw4\" (UniqueName: \"kubernetes.io/projected/3f648709-678d-4844-8571-ac5c5c5712a3-kube-api-access-fwzw4\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085647 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f648709-678d-4844-8571-ac5c5c5712a3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f648709-678d-4844-8571-ac5c5c5712a3-scripts\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-config\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f648709-678d-4844-8571-ac5c5c5712a3-config\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.085945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-dns-svc\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.087011 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f648709-678d-4844-8571-ac5c5c5712a3-scripts\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.087039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-dns-svc\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.087332 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f648709-678d-4844-8571-ac5c5c5712a3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.087384 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.088445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f648709-678d-4844-8571-ac5c5c5712a3-config\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.088453 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.089496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-config\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.101605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.101993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.102316 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f648709-678d-4844-8571-ac5c5c5712a3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.103680 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzw4\" (UniqueName: \"kubernetes.io/projected/3f648709-678d-4844-8571-ac5c5c5712a3-kube-api-access-fwzw4\") pod \"ovn-northd-0\" (UID: \"3f648709-678d-4844-8571-ac5c5c5712a3\") " pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.104738 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcgn6\" (UniqueName: \"kubernetes.io/projected/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-kube-api-access-rcgn6\") pod \"dnsmasq-dns-8554648995-zll5f\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.162150 4837 generic.go:334] "Generic (PLEG): container finished" podID="11ead61d-f315-4ee0-9dcb-a222012a9c36" containerID="8ee58bf36785879f3f1fe8860c61c188d7e907b4375513d56402db155c92c9d0" exitCode=0 Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.162459 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"11ead61d-f315-4ee0-9dcb-a222012a9c36","Type":"ContainerDied","Data":"8ee58bf36785879f3f1fe8860c61c188d7e907b4375513d56402db155c92c9d0"} Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.242734 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.249581 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.361123 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-rkfkw"] Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.575415 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vj9cc"] Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.692777 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zll5f"] Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.739802 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.757309 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6svgj"] Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.760949 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.791912 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6svgj"] Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.851012 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.912990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w84r\" (UniqueName: \"kubernetes.io/projected/bdb9c373-ac68-49f9-876d-6835e623ff5f-kube-api-access-2w84r\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.913074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.913105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-config\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.913190 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:53 crc kubenswrapper[4837]: I1014 13:15:53.913262 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.015087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-config\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.015207 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.015268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.015297 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w84r\" (UniqueName: \"kubernetes.io/projected/bdb9c373-ac68-49f9-876d-6835e623ff5f-kube-api-access-2w84r\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc 
kubenswrapper[4837]: I1014 13:15:54.015332 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.016171 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.016786 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-config\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.017349 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.017879 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.029934 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-zll5f"] Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.050037 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w84r\" (UniqueName: \"kubernetes.io/projected/bdb9c373-ac68-49f9-876d-6835e623ff5f-kube-api-access-2w84r\") pod \"dnsmasq-dns-b8fbc5445-6svgj\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.090465 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.171262 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"11ead61d-f315-4ee0-9dcb-a222012a9c36","Type":"ContainerStarted","Data":"7ba195ddefc389f619aca1c618d649a4b26eda0dc1ed03e504205364558e6649"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.173222 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vj9cc" event={"ID":"43da7026-edd3-4f7f-9944-1aff537446a0","Type":"ContainerStarted","Data":"6c9c07499fea69cc28e8b54d702e481c8e38c0eee28c737f8a85e78b6b1d66dd"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.182003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f648709-678d-4844-8571-ac5c5c5712a3","Type":"ContainerStarted","Data":"884f6cd7b85e58ebeec56c8c3fd9e1ddca5951fe488634e506abc4449fca086a"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.183577 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zll5f" event={"ID":"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31","Type":"ContainerStarted","Data":"f35c53aa0ed82ae81212ab18dbc4a9e05f8de4d74fec680a24d55910fd6d8ba3"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.185287 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="162a8777-0979-4087-959a-98cd20678758" containerID="4278fe50903b48166d5f52eaae8ee8e772e0185ef615b050b219509983a03c50" exitCode=0 Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.185378 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"162a8777-0979-4087-959a-98cd20678758","Type":"ContainerDied","Data":"4278fe50903b48166d5f52eaae8ee8e772e0185ef615b050b219509983a03c50"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.187679 4837 generic.go:334] "Generic (PLEG): container finished" podID="5d21e444-c515-4894-9d45-4666a8a22b81" containerID="89b71b0345f734a42233e79aaa0a6a450cfd0c2d9852662769776edab56902a7" exitCode=0 Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.188297 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" event={"ID":"5d21e444-c515-4894-9d45-4666a8a22b81","Type":"ContainerDied","Data":"89b71b0345f734a42233e79aaa0a6a450cfd0c2d9852662769776edab56902a7"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.188320 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" event={"ID":"5d21e444-c515-4894-9d45-4666a8a22b81","Type":"ContainerStarted","Data":"e697b5c69750530fa07f0208dcdc5b13f2bb2b3cc33913071995b5ae1449fd47"} Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.203385 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.307277557 podStartE2EDuration="34.203360903s" podCreationTimestamp="2025-10-14 13:15:20 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.765875162 +0000 UTC m=+873.682874975" lastFinishedPulling="2025-10-14 13:15:43.661958508 +0000 UTC m=+881.578958321" observedRunningTime="2025-10-14 13:15:54.19408145 +0000 UTC m=+892.111081283" watchObservedRunningTime="2025-10-14 13:15:54.203360903 +0000 UTC m=+892.120360716" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 
13:15:54.551535 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.590977 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6svgj"] Oct 14 13:15:54 crc kubenswrapper[4837]: W1014 13:15:54.596081 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdb9c373_ac68_49f9_876d_6835e623ff5f.slice/crio-20ff5eb59b5151bfddcca7dacee03fe95051ed47cb32f7f443459495e3d81de7 WatchSource:0}: Error finding container 20ff5eb59b5151bfddcca7dacee03fe95051ed47cb32f7f443459495e3d81de7: Status 404 returned error can't find the container with id 20ff5eb59b5151bfddcca7dacee03fe95051ed47cb32f7f443459495e3d81de7 Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.635432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djktr\" (UniqueName: \"kubernetes.io/projected/5d21e444-c515-4894-9d45-4666a8a22b81-kube-api-access-djktr\") pod \"5d21e444-c515-4894-9d45-4666a8a22b81\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.635511 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-config\") pod \"5d21e444-c515-4894-9d45-4666a8a22b81\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.635602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-ovsdbserver-sb\") pod \"5d21e444-c515-4894-9d45-4666a8a22b81\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.635673 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-dns-svc\") pod \"5d21e444-c515-4894-9d45-4666a8a22b81\" (UID: \"5d21e444-c515-4894-9d45-4666a8a22b81\") " Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.639594 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d21e444-c515-4894-9d45-4666a8a22b81-kube-api-access-djktr" (OuterVolumeSpecName: "kube-api-access-djktr") pod "5d21e444-c515-4894-9d45-4666a8a22b81" (UID: "5d21e444-c515-4894-9d45-4666a8a22b81"). InnerVolumeSpecName "kube-api-access-djktr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.654665 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d21e444-c515-4894-9d45-4666a8a22b81" (UID: "5d21e444-c515-4894-9d45-4666a8a22b81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.681860 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d21e444-c515-4894-9d45-4666a8a22b81" (UID: "5d21e444-c515-4894-9d45-4666a8a22b81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.689681 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-config" (OuterVolumeSpecName: "config") pod "5d21e444-c515-4894-9d45-4666a8a22b81" (UID: "5d21e444-c515-4894-9d45-4666a8a22b81"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.736977 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.737225 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djktr\" (UniqueName: \"kubernetes.io/projected/5d21e444-c515-4894-9d45-4666a8a22b81-kube-api-access-djktr\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.737295 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.737351 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d21e444-c515-4894-9d45-4666a8a22b81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:54 crc kubenswrapper[4837]: E1014 13:15:54.930991 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d21e444_c515_4894_9d45_4666a8a22b81.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d21e444_c515_4894_9d45_4666a8a22b81.slice/crio-e697b5c69750530fa07f0208dcdc5b13f2bb2b3cc33913071995b5ae1449fd47\": RecentStats: unable to find data in memory cache]" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.960091 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 14 13:15:54 crc kubenswrapper[4837]: E1014 13:15:54.960574 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5d21e444-c515-4894-9d45-4666a8a22b81" containerName="init" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.960596 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d21e444-c515-4894-9d45-4666a8a22b81" containerName="init" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.960874 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d21e444-c515-4894-9d45-4666a8a22b81" containerName="init" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.970846 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.974753 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.975061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.975081 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jxdwl" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.975844 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 14 13:15:54 crc kubenswrapper[4837]: I1014 13:15:54.982577 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.044484 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-lock\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.044563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-cache\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.044628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.044671 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.044692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8px\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-kube-api-access-7k8px\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.145904 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.145999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 
14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.146049 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8px\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-kube-api-access-7k8px\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.146204 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-lock\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.146288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-cache\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.147126 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-cache\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.147631 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: E1014 13:15:55.147642 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 14 13:15:55 crc kubenswrapper[4837]: E1014 
13:15:55.147738 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.147761 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-lock\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: E1014 13:15:55.147844 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift podName:d34918e7-1e17-4d1d-a163-4d2f0539f2d7 nodeName:}" failed. No retries permitted until 2025-10-14 13:15:55.647808703 +0000 UTC m=+893.564808556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift") pod "swift-storage-0" (UID: "d34918e7-1e17-4d1d-a163-4d2f0539f2d7") : configmap "swift-ring-files" not found Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.175609 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8px\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-kube-api-access-7k8px\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.196845 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.206218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vj9cc" 
event={"ID":"43da7026-edd3-4f7f-9944-1aff537446a0","Type":"ContainerStarted","Data":"8505bc6b7fb4cba1ef597bd9291c39f2017771dc599e3356474c2226ba427569"} Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.211515 4837 generic.go:334] "Generic (PLEG): container finished" podID="61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" containerID="7d57b8477eddbc638ca0ca2b9ee1b46bf57a37e6dcddefc4f97f2a03ea7e6842" exitCode=0 Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.211643 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zll5f" event={"ID":"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31","Type":"ContainerDied","Data":"7d57b8477eddbc638ca0ca2b9ee1b46bf57a37e6dcddefc4f97f2a03ea7e6842"} Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.220906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"162a8777-0979-4087-959a-98cd20678758","Type":"ContainerStarted","Data":"a1a2f1935215519858b57922c335742eca592300806201950cf09cc13299f6ea"} Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.227819 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vj9cc" podStartSLOduration=3.227793781 podStartE2EDuration="3.227793781s" podCreationTimestamp="2025-10-14 13:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:15:55.223324878 +0000 UTC m=+893.140324721" watchObservedRunningTime="2025-10-14 13:15:55.227793781 +0000 UTC m=+893.144793614" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.232839 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.233336 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-rkfkw" event={"ID":"5d21e444-c515-4894-9d45-4666a8a22b81","Type":"ContainerDied","Data":"e697b5c69750530fa07f0208dcdc5b13f2bb2b3cc33913071995b5ae1449fd47"} Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.233423 4837 scope.go:117] "RemoveContainer" containerID="89b71b0345f734a42233e79aaa0a6a450cfd0c2d9852662769776edab56902a7" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.271555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" event={"ID":"bdb9c373-ac68-49f9-876d-6835e623ff5f","Type":"ContainerStarted","Data":"7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae"} Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.271935 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" event={"ID":"bdb9c373-ac68-49f9-876d-6835e623ff5f","Type":"ContainerStarted","Data":"20ff5eb59b5151bfddcca7dacee03fe95051ed47cb32f7f443459495e3d81de7"} Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.364289 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.300503633 podStartE2EDuration="37.364272431s" podCreationTimestamp="2025-10-14 13:15:18 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.072508896 +0000 UTC m=+872.989508709" lastFinishedPulling="2025-10-14 13:15:45.136277704 +0000 UTC m=+883.053277507" observedRunningTime="2025-10-14 13:15:55.35614607 +0000 UTC m=+893.273145893" watchObservedRunningTime="2025-10-14 13:15:55.364272431 +0000 UTC m=+893.281272244" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.439929 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-rkfkw"] Oct 14 13:15:55 crc 
kubenswrapper[4837]: I1014 13:15:55.446153 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-rkfkw"] Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.531647 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zww9d"] Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.532795 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.536189 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.536931 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.537154 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.541960 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zww9d"] Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.606388 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667488 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-ring-data-devices\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667558 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1960b9c9-0169-447b-a184-21c3522760f8-etc-swift\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-dispersionconf\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-swiftconf\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667720 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-combined-ca-bundle\") pod \"swift-ring-rebalance-zww9d\" (UID: 
\"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8bj\" (UniqueName: \"kubernetes.io/projected/1960b9c9-0169-447b-a184-21c3522760f8-kube-api-access-2n8bj\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.667976 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-scripts\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.668022 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:55 crc kubenswrapper[4837]: E1014 13:15:55.668218 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 14 13:15:55 crc kubenswrapper[4837]: E1014 13:15:55.668236 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 14 13:15:55 crc kubenswrapper[4837]: E1014 13:15:55.668281 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift podName:d34918e7-1e17-4d1d-a163-4d2f0539f2d7 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:15:56.668268353 +0000 UTC m=+894.585268166 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift") pod "swift-storage-0" (UID: "d34918e7-1e17-4d1d-a163-4d2f0539f2d7") : configmap "swift-ring-files" not found Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769164 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-config\") pod \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769277 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcgn6\" (UniqueName: \"kubernetes.io/projected/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-kube-api-access-rcgn6\") pod \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769305 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-sb\") pod \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769353 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-dns-svc\") pod \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769405 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-nb\") pod \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\" (UID: \"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31\") " Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769638 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-scripts\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769695 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-ring-data-devices\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1960b9c9-0169-447b-a184-21c3522760f8-etc-swift\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-dispersionconf\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-swiftconf\") pod \"swift-ring-rebalance-zww9d\" (UID: 
\"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769797 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-combined-ca-bundle\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.769842 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8bj\" (UniqueName: \"kubernetes.io/projected/1960b9c9-0169-447b-a184-21c3522760f8-kube-api-access-2n8bj\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.770489 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-scripts\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.770659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-ring-data-devices\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.770769 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1960b9c9-0169-447b-a184-21c3522760f8-etc-swift\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 
13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.774033 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-swiftconf\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.774817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-combined-ca-bundle\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.775082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-dispersionconf\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.776870 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-kube-api-access-rcgn6" (OuterVolumeSpecName: "kube-api-access-rcgn6") pod "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" (UID: "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31"). InnerVolumeSpecName "kube-api-access-rcgn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.788199 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8bj\" (UniqueName: \"kubernetes.io/projected/1960b9c9-0169-447b-a184-21c3522760f8-kube-api-access-2n8bj\") pod \"swift-ring-rebalance-zww9d\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.788849 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" (UID: "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.793909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" (UID: "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.795220 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-config" (OuterVolumeSpecName: "config") pod "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" (UID: "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.797911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" (UID: "61dc58fa-f2ab-43a8-ba9f-e67cb2050a31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.851657 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.871794 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.871830 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcgn6\" (UniqueName: \"kubernetes.io/projected/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-kube-api-access-rcgn6\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.871847 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.871858 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:55 crc kubenswrapper[4837]: I1014 13:15:55.871868 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:15:56 
crc kubenswrapper[4837]: I1014 13:15:56.294688 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zww9d"] Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.295435 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zll5f" event={"ID":"61dc58fa-f2ab-43a8-ba9f-e67cb2050a31","Type":"ContainerDied","Data":"f35c53aa0ed82ae81212ab18dbc4a9e05f8de4d74fec680a24d55910fd6d8ba3"} Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.295471 4837 scope.go:117] "RemoveContainer" containerID="7d57b8477eddbc638ca0ca2b9ee1b46bf57a37e6dcddefc4f97f2a03ea7e6842" Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.294846 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zll5f" Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.404810 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zll5f"] Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.411945 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zll5f"] Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.690913 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:56 crc kubenswrapper[4837]: E1014 13:15:56.691140 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 14 13:15:56 crc kubenswrapper[4837]: E1014 13:15:56.691195 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 14 13:15:56 crc kubenswrapper[4837]: E1014 13:15:56.691273 4837 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift podName:d34918e7-1e17-4d1d-a163-4d2f0539f2d7 nodeName:}" failed. No retries permitted until 2025-10-14 13:15:58.69124294 +0000 UTC m=+896.608242753 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift") pod "swift-storage-0" (UID: "d34918e7-1e17-4d1d-a163-4d2f0539f2d7") : configmap "swift-ring-files" not found Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.793375 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d21e444-c515-4894-9d45-4666a8a22b81" path="/var/lib/kubelet/pods/5d21e444-c515-4894-9d45-4666a8a22b81/volumes" Oct 14 13:15:56 crc kubenswrapper[4837]: I1014 13:15:56.794003 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" path="/var/lib/kubelet/pods/61dc58fa-f2ab-43a8-ba9f-e67cb2050a31/volumes" Oct 14 13:15:57 crc kubenswrapper[4837]: I1014 13:15:57.305546 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zww9d" event={"ID":"1960b9c9-0169-447b-a184-21c3522760f8","Type":"ContainerStarted","Data":"4135403cfce8ea79de057ec35ed0d1235c76e549acf5f6308fd7d59ab838993c"} Oct 14 13:15:57 crc kubenswrapper[4837]: I1014 13:15:57.307598 4837 generic.go:334] "Generic (PLEG): container finished" podID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerID="7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae" exitCode=0 Oct 14 13:15:57 crc kubenswrapper[4837]: I1014 13:15:57.307701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" event={"ID":"bdb9c373-ac68-49f9-876d-6835e623ff5f","Type":"ContainerDied","Data":"7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae"} Oct 14 13:15:58 crc kubenswrapper[4837]: I1014 13:15:58.726523 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:15:58 crc kubenswrapper[4837]: E1014 13:15:58.726737 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 14 13:15:58 crc kubenswrapper[4837]: E1014 13:15:58.726769 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 14 13:15:58 crc kubenswrapper[4837]: E1014 13:15:58.726832 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift podName:d34918e7-1e17-4d1d-a163-4d2f0539f2d7 nodeName:}" failed. No retries permitted until 2025-10-14 13:16:02.726810801 +0000 UTC m=+900.643810614 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift") pod "swift-storage-0" (UID: "d34918e7-1e17-4d1d-a163-4d2f0539f2d7") : configmap "swift-ring-files" not found Oct 14 13:15:59 crc kubenswrapper[4837]: I1014 13:15:59.336408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" event={"ID":"bdb9c373-ac68-49f9-876d-6835e623ff5f","Type":"ContainerStarted","Data":"ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b"} Oct 14 13:15:59 crc kubenswrapper[4837]: I1014 13:15:59.336813 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:15:59 crc kubenswrapper[4837]: I1014 13:15:59.370843 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" podStartSLOduration=6.370812878 podStartE2EDuration="6.370812878s" podCreationTimestamp="2025-10-14 13:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:15:59.361053191 +0000 UTC m=+897.278053034" watchObservedRunningTime="2025-10-14 13:15:59.370812878 +0000 UTC m=+897.287812711" Oct 14 13:16:00 crc kubenswrapper[4837]: I1014 13:16:00.348266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f648709-678d-4844-8571-ac5c5c5712a3","Type":"ContainerStarted","Data":"65779df1a0bfa2a21c5f7775d949ae95cafa79b076d61eda47e8a7212a805375"} Oct 14 13:16:00 crc kubenswrapper[4837]: I1014 13:16:00.555460 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 14 13:16:00 crc kubenswrapper[4837]: I1014 13:16:00.555939 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 14 13:16:01 crc kubenswrapper[4837]: I1014 
13:16:01.708870 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 14 13:16:01 crc kubenswrapper[4837]: I1014 13:16:01.708953 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 14 13:16:02 crc kubenswrapper[4837]: I1014 13:16:02.803992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:16:02 crc kubenswrapper[4837]: E1014 13:16:02.804238 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 14 13:16:02 crc kubenswrapper[4837]: E1014 13:16:02.804496 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 14 13:16:02 crc kubenswrapper[4837]: E1014 13:16:02.804573 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift podName:d34918e7-1e17-4d1d-a163-4d2f0539f2d7 nodeName:}" failed. No retries permitted until 2025-10-14 13:16:10.804551273 +0000 UTC m=+908.721551086 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift") pod "swift-storage-0" (UID: "d34918e7-1e17-4d1d-a163-4d2f0539f2d7") : configmap "swift-ring-files" not found Oct 14 13:16:03 crc kubenswrapper[4837]: I1014 13:16:03.387176 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f648709-678d-4844-8571-ac5c5c5712a3","Type":"ContainerStarted","Data":"f46fd9ca5709442db8b1ba745f5f3fe85b606403ab4dd809df75d8c766178318"} Oct 14 13:16:03 crc kubenswrapper[4837]: I1014 13:16:03.387513 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 14 13:16:03 crc kubenswrapper[4837]: I1014 13:16:03.417770 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.836590192 podStartE2EDuration="11.417752128s" podCreationTimestamp="2025-10-14 13:15:52 +0000 UTC" firstStartedPulling="2025-10-14 13:15:53.913335213 +0000 UTC m=+891.830335026" lastFinishedPulling="2025-10-14 13:15:59.494497139 +0000 UTC m=+897.411496962" observedRunningTime="2025-10-14 13:16:03.410174401 +0000 UTC m=+901.327174235" watchObservedRunningTime="2025-10-14 13:16:03.417752128 +0000 UTC m=+901.334751941" Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.092941 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.141082 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7v48"] Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.141364 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerName="dnsmasq-dns" 
containerID="cri-o://2767bf5c92185163626f471585ef7dfa566e833a02bb3f4ec4d09fe9787ae0c5" gracePeriod=10 Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.163356 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.218001 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.396551 4837 generic.go:334] "Generic (PLEG): container finished" podID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerID="2767bf5c92185163626f471585ef7dfa566e833a02bb3f4ec4d09fe9787ae0c5" exitCode=0 Oct 14 13:16:04 crc kubenswrapper[4837]: I1014 13:16:04.396625 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" event={"ID":"b029d6b5-4398-42bf-abb7-30bd70de9142","Type":"ContainerDied","Data":"2767bf5c92185163626f471585ef7dfa566e833a02bb3f4ec4d09fe9787ae0c5"} Oct 14 13:16:04 crc kubenswrapper[4837]: E1014 13:16:04.469812 4837 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.66:44350->38.102.83.66:36943: write tcp 38.102.83.66:44350->38.102.83.66:36943: write: broken pipe Oct 14 13:16:06 crc kubenswrapper[4837]: I1014 13:16:06.838351 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.034047 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-config\") pod \"b029d6b5-4398-42bf-abb7-30bd70de9142\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.034643 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg7bt\" (UniqueName: \"kubernetes.io/projected/b029d6b5-4398-42bf-abb7-30bd70de9142-kube-api-access-sg7bt\") pod \"b029d6b5-4398-42bf-abb7-30bd70de9142\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.034819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-dns-svc\") pod \"b029d6b5-4398-42bf-abb7-30bd70de9142\" (UID: \"b029d6b5-4398-42bf-abb7-30bd70de9142\") " Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.038097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b029d6b5-4398-42bf-abb7-30bd70de9142-kube-api-access-sg7bt" (OuterVolumeSpecName: "kube-api-access-sg7bt") pod "b029d6b5-4398-42bf-abb7-30bd70de9142" (UID: "b029d6b5-4398-42bf-abb7-30bd70de9142"). InnerVolumeSpecName "kube-api-access-sg7bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.068316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b029d6b5-4398-42bf-abb7-30bd70de9142" (UID: "b029d6b5-4398-42bf-abb7-30bd70de9142"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.069307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-config" (OuterVolumeSpecName: "config") pod "b029d6b5-4398-42bf-abb7-30bd70de9142" (UID: "b029d6b5-4398-42bf-abb7-30bd70de9142"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.136480 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.136515 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg7bt\" (UniqueName: \"kubernetes.io/projected/b029d6b5-4398-42bf-abb7-30bd70de9142-kube-api-access-sg7bt\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.136527 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b029d6b5-4398-42bf-abb7-30bd70de9142-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.435692 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zww9d" event={"ID":"1960b9c9-0169-447b-a184-21c3522760f8","Type":"ContainerStarted","Data":"568cf609e7ea36aac64eb14eeb16a24c7e02d4c9cfc7ed08dcce8f0639d266b9"} Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.439284 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" event={"ID":"b029d6b5-4398-42bf-abb7-30bd70de9142","Type":"ContainerDied","Data":"8c9262825bf08b4e25a2d024e2d2d47facd80ce3a2992659a722bca2db235d50"} Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.439330 4837 scope.go:117] "RemoveContainer" 
containerID="2767bf5c92185163626f471585ef7dfa566e833a02bb3f4ec4d09fe9787ae0c5" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.439386 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7v48" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.468759 4837 scope.go:117] "RemoveContainer" containerID="afd13dec87e53073329562d854e37abdf2286f524d777b60cc343cbf615417cd" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.481421 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-zww9d" podStartSLOduration=2.005368759 podStartE2EDuration="12.481387236s" podCreationTimestamp="2025-10-14 13:15:55 +0000 UTC" firstStartedPulling="2025-10-14 13:15:56.29546438 +0000 UTC m=+894.212464193" lastFinishedPulling="2025-10-14 13:16:06.771482817 +0000 UTC m=+904.688482670" observedRunningTime="2025-10-14 13:16:07.462835328 +0000 UTC m=+905.379835191" watchObservedRunningTime="2025-10-14 13:16:07.481387236 +0000 UTC m=+905.398387099" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.494479 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7v48"] Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.502681 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7v48"] Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.787798 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 14 13:16:07 crc kubenswrapper[4837]: I1014 13:16:07.841604 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 14 13:16:08 crc kubenswrapper[4837]: I1014 13:16:08.794445 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" 
path="/var/lib/kubelet/pods/b029d6b5-4398-42bf-abb7-30bd70de9142/volumes" Oct 14 13:16:10 crc kubenswrapper[4837]: I1014 13:16:10.906890 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:16:10 crc kubenswrapper[4837]: E1014 13:16:10.907208 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 14 13:16:10 crc kubenswrapper[4837]: E1014 13:16:10.907495 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 14 13:16:10 crc kubenswrapper[4837]: E1014 13:16:10.907557 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift podName:d34918e7-1e17-4d1d-a163-4d2f0539f2d7 nodeName:}" failed. No retries permitted until 2025-10-14 13:16:26.907534274 +0000 UTC m=+924.824534077 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift") pod "swift-storage-0" (UID: "d34918e7-1e17-4d1d-a163-4d2f0539f2d7") : configmap "swift-ring-files" not found Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.140424 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.140510 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.567982 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-smpbv"] Oct 14 13:16:11 crc kubenswrapper[4837]: E1014 13:16:11.568351 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" containerName="init" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.568370 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" containerName="init" Oct 14 13:16:11 crc kubenswrapper[4837]: E1014 13:16:11.568385 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerName="dnsmasq-dns" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.568390 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerName="dnsmasq-dns" Oct 14 13:16:11 crc kubenswrapper[4837]: E1014 13:16:11.568408 4837 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerName="init" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.568415 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerName="init" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.568613 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b029d6b5-4398-42bf-abb7-30bd70de9142" containerName="dnsmasq-dns" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.568635 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="61dc58fa-f2ab-43a8-ba9f-e67cb2050a31" containerName="init" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.569277 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.584701 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-smpbv"] Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.720436 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tt5k\" (UniqueName: \"kubernetes.io/projected/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22-kube-api-access-7tt5k\") pod \"keystone-db-create-smpbv\" (UID: \"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22\") " pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.770318 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rzn49"] Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.773125 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzn49" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.801338 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rzn49"] Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.821877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tt5k\" (UniqueName: \"kubernetes.io/projected/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22-kube-api-access-7tt5k\") pod \"keystone-db-create-smpbv\" (UID: \"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22\") " pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.843133 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tt5k\" (UniqueName: \"kubernetes.io/projected/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22-kube-api-access-7tt5k\") pod \"keystone-db-create-smpbv\" (UID: \"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22\") " pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.893980 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:11 crc kubenswrapper[4837]: I1014 13:16:11.923383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9zrz\" (UniqueName: \"kubernetes.io/projected/28b9af55-6774-4230-a7c1-ef5b49d1ab29-kube-api-access-l9zrz\") pod \"placement-db-create-rzn49\" (UID: \"28b9af55-6774-4230-a7c1-ef5b49d1ab29\") " pod="openstack/placement-db-create-rzn49" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.004739 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qm4ns"] Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.009094 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.016439 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qm4ns"] Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.028723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9zrz\" (UniqueName: \"kubernetes.io/projected/28b9af55-6774-4230-a7c1-ef5b49d1ab29-kube-api-access-l9zrz\") pod \"placement-db-create-rzn49\" (UID: \"28b9af55-6774-4230-a7c1-ef5b49d1ab29\") " pod="openstack/placement-db-create-rzn49" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.046911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9zrz\" (UniqueName: \"kubernetes.io/projected/28b9af55-6774-4230-a7c1-ef5b49d1ab29-kube-api-access-l9zrz\") pod \"placement-db-create-rzn49\" (UID: \"28b9af55-6774-4230-a7c1-ef5b49d1ab29\") " pod="openstack/placement-db-create-rzn49" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.117065 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzn49" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.130791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptqbb\" (UniqueName: \"kubernetes.io/projected/47bcfb5e-cbb6-4626-9264-93e2be887f1c-kube-api-access-ptqbb\") pod \"glance-db-create-qm4ns\" (UID: \"47bcfb5e-cbb6-4626-9264-93e2be887f1c\") " pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.232246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptqbb\" (UniqueName: \"kubernetes.io/projected/47bcfb5e-cbb6-4626-9264-93e2be887f1c-kube-api-access-ptqbb\") pod \"glance-db-create-qm4ns\" (UID: \"47bcfb5e-cbb6-4626-9264-93e2be887f1c\") " pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.251512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptqbb\" (UniqueName: \"kubernetes.io/projected/47bcfb5e-cbb6-4626-9264-93e2be887f1c-kube-api-access-ptqbb\") pod \"glance-db-create-qm4ns\" (UID: \"47bcfb5e-cbb6-4626-9264-93e2be887f1c\") " pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.342711 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.343114 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-smpbv"] Oct 14 13:16:12 crc kubenswrapper[4837]: W1014 13:16:12.359729 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36c60778_c2e2_4d03_bbfa_f68e1e8b8c22.slice/crio-f9d5c28ffa84a12a9c64f415f87ab68be1d350748c5714f7c6fb01733afb0e6f WatchSource:0}: Error finding container f9d5c28ffa84a12a9c64f415f87ab68be1d350748c5714f7c6fb01733afb0e6f: Status 404 returned error can't find the container with id f9d5c28ffa84a12a9c64f415f87ab68be1d350748c5714f7c6fb01733afb0e6f Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.492228 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-smpbv" event={"ID":"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22","Type":"ContainerStarted","Data":"f9d5c28ffa84a12a9c64f415f87ab68be1d350748c5714f7c6fb01733afb0e6f"} Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.554893 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rzn49"] Oct 14 13:16:12 crc kubenswrapper[4837]: W1014 13:16:12.568627 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b9af55_6774_4230_a7c1_ef5b49d1ab29.slice/crio-c3963b2fd80e3f9e64663ca7ce750792593f6ab9d17a03c0654656bb38cfb36e WatchSource:0}: Error finding container c3963b2fd80e3f9e64663ca7ce750792593f6ab9d17a03c0654656bb38cfb36e: Status 404 returned error can't find the container with id c3963b2fd80e3f9e64663ca7ce750792593f6ab9d17a03c0654656bb38cfb36e Oct 14 13:16:12 crc kubenswrapper[4837]: W1014 13:16:12.793506 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47bcfb5e_cbb6_4626_9264_93e2be887f1c.slice/crio-8363a45e6b823d57ea8a7311b3d786e947281247b45d2a2a9a892d3c3d9a2314 WatchSource:0}: Error finding container 8363a45e6b823d57ea8a7311b3d786e947281247b45d2a2a9a892d3c3d9a2314: Status 404 returned error can't find the container with id 8363a45e6b823d57ea8a7311b3d786e947281247b45d2a2a9a892d3c3d9a2314 Oct 14 13:16:12 crc kubenswrapper[4837]: I1014 13:16:12.803836 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qm4ns"] Oct 14 13:16:13 crc kubenswrapper[4837]: I1014 13:16:13.304501 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 14 13:16:13 crc kubenswrapper[4837]: I1014 13:16:13.501149 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzn49" event={"ID":"28b9af55-6774-4230-a7c1-ef5b49d1ab29","Type":"ContainerStarted","Data":"c3963b2fd80e3f9e64663ca7ce750792593f6ab9d17a03c0654656bb38cfb36e"} Oct 14 13:16:13 crc kubenswrapper[4837]: I1014 13:16:13.502210 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qm4ns" event={"ID":"47bcfb5e-cbb6-4626-9264-93e2be887f1c","Type":"ContainerStarted","Data":"8363a45e6b823d57ea8a7311b3d786e947281247b45d2a2a9a892d3c3d9a2314"} Oct 14 13:16:14 crc kubenswrapper[4837]: I1014 13:16:14.514470 4837 generic.go:334] "Generic (PLEG): container finished" podID="36c60778-c2e2-4d03-bbfa-f68e1e8b8c22" containerID="d4541d32894d96ad481241adf9f042195e567a7506ae4ce4bcdb491726b0401a" exitCode=0 Oct 14 13:16:14 crc kubenswrapper[4837]: I1014 13:16:14.514530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-smpbv" event={"ID":"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22","Type":"ContainerDied","Data":"d4541d32894d96ad481241adf9f042195e567a7506ae4ce4bcdb491726b0401a"} Oct 14 13:16:14 crc kubenswrapper[4837]: I1014 13:16:14.518801 4837 
generic.go:334] "Generic (PLEG): container finished" podID="28b9af55-6774-4230-a7c1-ef5b49d1ab29" containerID="e8baea8a872c4f79e0ba7f7f73b864ec9d181a63b44ae45c6769b1ba70748b38" exitCode=0 Oct 14 13:16:14 crc kubenswrapper[4837]: I1014 13:16:14.518891 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzn49" event={"ID":"28b9af55-6774-4230-a7c1-ef5b49d1ab29","Type":"ContainerDied","Data":"e8baea8a872c4f79e0ba7f7f73b864ec9d181a63b44ae45c6769b1ba70748b38"} Oct 14 13:16:14 crc kubenswrapper[4837]: I1014 13:16:14.520946 4837 generic.go:334] "Generic (PLEG): container finished" podID="47bcfb5e-cbb6-4626-9264-93e2be887f1c" containerID="3d653ebce0d14223e36104910a9de58468ea7f218d46625b8420c021c1e3eb4d" exitCode=0 Oct 14 13:16:14 crc kubenswrapper[4837]: I1014 13:16:14.521012 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qm4ns" event={"ID":"47bcfb5e-cbb6-4626-9264-93e2be887f1c","Type":"ContainerDied","Data":"3d653ebce0d14223e36104910a9de58468ea7f218d46625b8420c021c1e3eb4d"} Oct 14 13:16:15 crc kubenswrapper[4837]: I1014 13:16:15.530820 4837 generic.go:334] "Generic (PLEG): container finished" podID="1960b9c9-0169-447b-a184-21c3522760f8" containerID="568cf609e7ea36aac64eb14eeb16a24c7e02d4c9cfc7ed08dcce8f0639d266b9" exitCode=0 Oct 14 13:16:15 crc kubenswrapper[4837]: I1014 13:16:15.530936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zww9d" event={"ID":"1960b9c9-0169-447b-a184-21c3522760f8","Type":"ContainerDied","Data":"568cf609e7ea36aac64eb14eeb16a24c7e02d4c9cfc7ed08dcce8f0639d266b9"} Oct 14 13:16:15 crc kubenswrapper[4837]: I1014 13:16:15.975500 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:15 crc kubenswrapper[4837]: I1014 13:16:15.981471 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzn49" Oct 14 13:16:15 crc kubenswrapper[4837]: I1014 13:16:15.988180 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.089347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tt5k\" (UniqueName: \"kubernetes.io/projected/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22-kube-api-access-7tt5k\") pod \"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22\" (UID: \"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22\") " Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.089436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptqbb\" (UniqueName: \"kubernetes.io/projected/47bcfb5e-cbb6-4626-9264-93e2be887f1c-kube-api-access-ptqbb\") pod \"47bcfb5e-cbb6-4626-9264-93e2be887f1c\" (UID: \"47bcfb5e-cbb6-4626-9264-93e2be887f1c\") " Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.089486 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9zrz\" (UniqueName: \"kubernetes.io/projected/28b9af55-6774-4230-a7c1-ef5b49d1ab29-kube-api-access-l9zrz\") pod \"28b9af55-6774-4230-a7c1-ef5b49d1ab29\" (UID: \"28b9af55-6774-4230-a7c1-ef5b49d1ab29\") " Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.094800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b9af55-6774-4230-a7c1-ef5b49d1ab29-kube-api-access-l9zrz" (OuterVolumeSpecName: "kube-api-access-l9zrz") pod "28b9af55-6774-4230-a7c1-ef5b49d1ab29" (UID: "28b9af55-6774-4230-a7c1-ef5b49d1ab29"). InnerVolumeSpecName "kube-api-access-l9zrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.094854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47bcfb5e-cbb6-4626-9264-93e2be887f1c-kube-api-access-ptqbb" (OuterVolumeSpecName: "kube-api-access-ptqbb") pod "47bcfb5e-cbb6-4626-9264-93e2be887f1c" (UID: "47bcfb5e-cbb6-4626-9264-93e2be887f1c"). InnerVolumeSpecName "kube-api-access-ptqbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.094853 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22-kube-api-access-7tt5k" (OuterVolumeSpecName: "kube-api-access-7tt5k") pod "36c60778-c2e2-4d03-bbfa-f68e1e8b8c22" (UID: "36c60778-c2e2-4d03-bbfa-f68e1e8b8c22"). InnerVolumeSpecName "kube-api-access-7tt5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.191442 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tt5k\" (UniqueName: \"kubernetes.io/projected/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22-kube-api-access-7tt5k\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.191499 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptqbb\" (UniqueName: \"kubernetes.io/projected/47bcfb5e-cbb6-4626-9264-93e2be887f1c-kube-api-access-ptqbb\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.191517 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9zrz\" (UniqueName: \"kubernetes.io/projected/28b9af55-6774-4230-a7c1-ef5b49d1ab29-kube-api-access-l9zrz\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.541061 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzn49" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.541116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzn49" event={"ID":"28b9af55-6774-4230-a7c1-ef5b49d1ab29","Type":"ContainerDied","Data":"c3963b2fd80e3f9e64663ca7ce750792593f6ab9d17a03c0654656bb38cfb36e"} Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.541177 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3963b2fd80e3f9e64663ca7ce750792593f6ab9d17a03c0654656bb38cfb36e" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.543251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qm4ns" event={"ID":"47bcfb5e-cbb6-4626-9264-93e2be887f1c","Type":"ContainerDied","Data":"8363a45e6b823d57ea8a7311b3d786e947281247b45d2a2a9a892d3c3d9a2314"} Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.543275 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qm4ns" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.543295 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8363a45e6b823d57ea8a7311b3d786e947281247b45d2a2a9a892d3c3d9a2314" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.545591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-smpbv" event={"ID":"36c60778-c2e2-4d03-bbfa-f68e1e8b8c22","Type":"ContainerDied","Data":"f9d5c28ffa84a12a9c64f415f87ab68be1d350748c5714f7c6fb01733afb0e6f"} Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.545656 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d5c28ffa84a12a9c64f415f87ab68be1d350748c5714f7c6fb01733afb0e6f" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.545702 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-smpbv" Oct 14 13:16:16 crc kubenswrapper[4837]: I1014 13:16:16.941029 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.005840 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-ring-data-devices\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.005886 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-dispersionconf\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.005949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1960b9c9-0169-447b-a184-21c3522760f8-etc-swift\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.005972 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n8bj\" (UniqueName: \"kubernetes.io/projected/1960b9c9-0169-447b-a184-21c3522760f8-kube-api-access-2n8bj\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.006016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-swiftconf\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: 
\"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.006060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-combined-ca-bundle\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.006084 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-scripts\") pod \"1960b9c9-0169-447b-a184-21c3522760f8\" (UID: \"1960b9c9-0169-447b-a184-21c3522760f8\") " Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.007523 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1960b9c9-0169-447b-a184-21c3522760f8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.008173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.020246 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1960b9c9-0169-447b-a184-21c3522760f8-kube-api-access-2n8bj" (OuterVolumeSpecName: "kube-api-access-2n8bj") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). 
InnerVolumeSpecName "kube-api-access-2n8bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.025032 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.049982 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-scripts" (OuterVolumeSpecName: "scripts") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.076341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.100335 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1960b9c9-0169-447b-a184-21c3522760f8" (UID: "1960b9c9-0169-447b-a184-21c3522760f8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108333 4837 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108538 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108550 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108558 4837 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1960b9c9-0169-447b-a184-21c3522760f8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108569 4837 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1960b9c9-0169-447b-a184-21c3522760f8-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108577 4837 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1960b9c9-0169-447b-a184-21c3522760f8-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.108585 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n8bj\" (UniqueName: \"kubernetes.io/projected/1960b9c9-0169-447b-a184-21c3522760f8-kube-api-access-2n8bj\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.568430 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zww9d" event={"ID":"1960b9c9-0169-447b-a184-21c3522760f8","Type":"ContainerDied","Data":"4135403cfce8ea79de057ec35ed0d1235c76e549acf5f6308fd7d59ab838993c"} Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.568480 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4135403cfce8ea79de057ec35ed0d1235c76e549acf5f6308fd7d59ab838993c" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.568494 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zww9d" Oct 14 13:16:17 crc kubenswrapper[4837]: I1014 13:16:17.832949 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j4tpc" podUID="14f970e0-8d42-46d6-937a-c39f521f6bea" containerName="ovn-controller" probeResult="failure" output=< Oct 14 13:16:17 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 14 13:16:17 crc kubenswrapper[4837]: > Oct 14 13:16:19 crc kubenswrapper[4837]: I1014 13:16:19.585450 4837 generic.go:334] "Generic (PLEG): container finished" podID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerID="3f5e3ffe8a2b7184a62f038431da9b3670de00f0a32b5022a406da34cfe945a7" exitCode=0 Oct 14 13:16:19 crc kubenswrapper[4837]: I1014 13:16:19.585524 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0fcd80e-9aec-4608-bfc5-653c443d1849","Type":"ContainerDied","Data":"3f5e3ffe8a2b7184a62f038431da9b3670de00f0a32b5022a406da34cfe945a7"} Oct 14 13:16:19 crc kubenswrapper[4837]: I1014 13:16:19.587510 4837 generic.go:334] "Generic (PLEG): container finished" podID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerID="46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce" exitCode=0 Oct 14 13:16:19 crc kubenswrapper[4837]: I1014 13:16:19.587551 4837 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6877e694-37ca-4cd4-ba01-3101d4f7ade4","Type":"ContainerDied","Data":"46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce"} Oct 14 13:16:20 crc kubenswrapper[4837]: I1014 13:16:20.598112 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6877e694-37ca-4cd4-ba01-3101d4f7ade4","Type":"ContainerStarted","Data":"0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe"} Oct 14 13:16:20 crc kubenswrapper[4837]: I1014 13:16:20.599229 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:16:20 crc kubenswrapper[4837]: I1014 13:16:20.601378 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0fcd80e-9aec-4608-bfc5-653c443d1849","Type":"ContainerStarted","Data":"987dc060a05aa5aefc79f1046096b30535fadb959f3f2adb2f63161a8f110645"} Oct 14 13:16:20 crc kubenswrapper[4837]: I1014 13:16:20.601988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 14 13:16:20 crc kubenswrapper[4837]: I1014 13:16:20.622928 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=54.705363207 podStartE2EDuration="1m3.622910898s" podCreationTimestamp="2025-10-14 13:15:17 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.706420637 +0000 UTC m=+873.623420450" lastFinishedPulling="2025-10-14 13:15:44.623968318 +0000 UTC m=+882.540968141" observedRunningTime="2025-10-14 13:16:20.618130817 +0000 UTC m=+918.535130630" watchObservedRunningTime="2025-10-14 13:16:20.622910898 +0000 UTC m=+918.539910711" Oct 14 13:16:20 crc kubenswrapper[4837]: I1014 13:16:20.648817 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=55.073401189 
podStartE2EDuration="1m3.648800386s" podCreationTimestamp="2025-10-14 13:15:17 +0000 UTC" firstStartedPulling="2025-10-14 13:15:35.706678924 +0000 UTC m=+873.623678737" lastFinishedPulling="2025-10-14 13:15:44.282078081 +0000 UTC m=+882.199077934" observedRunningTime="2025-10-14 13:16:20.647075189 +0000 UTC m=+918.564075002" watchObservedRunningTime="2025-10-14 13:16:20.648800386 +0000 UTC m=+918.565800199" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.159934 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c6dd-account-create-xw7kc"] Oct 14 13:16:22 crc kubenswrapper[4837]: E1014 13:16:22.160326 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b9af55-6774-4230-a7c1-ef5b49d1ab29" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160345 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b9af55-6774-4230-a7c1-ef5b49d1ab29" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: E1014 13:16:22.160356 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1960b9c9-0169-447b-a184-21c3522760f8" containerName="swift-ring-rebalance" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160365 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1960b9c9-0169-447b-a184-21c3522760f8" containerName="swift-ring-rebalance" Oct 14 13:16:22 crc kubenswrapper[4837]: E1014 13:16:22.160385 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c60778-c2e2-4d03-bbfa-f68e1e8b8c22" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160394 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c60778-c2e2-4d03-bbfa-f68e1e8b8c22" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: E1014 13:16:22.160406 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bcfb5e-cbb6-4626-9264-93e2be887f1c" 
containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160414 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bcfb5e-cbb6-4626-9264-93e2be887f1c" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160621 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1960b9c9-0169-447b-a184-21c3522760f8" containerName="swift-ring-rebalance" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160637 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b9af55-6774-4230-a7c1-ef5b49d1ab29" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160655 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="47bcfb5e-cbb6-4626-9264-93e2be887f1c" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.160670 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c60778-c2e2-4d03-bbfa-f68e1e8b8c22" containerName="mariadb-database-create" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.161336 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.163636 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.184258 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j9x\" (UniqueName: \"kubernetes.io/projected/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e-kube-api-access-54j9x\") pod \"glance-c6dd-account-create-xw7kc\" (UID: \"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e\") " pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.226627 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c6dd-account-create-xw7kc"] Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.285493 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54j9x\" (UniqueName: \"kubernetes.io/projected/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e-kube-api-access-54j9x\") pod \"glance-c6dd-account-create-xw7kc\" (UID: \"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e\") " pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.304682 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j9x\" (UniqueName: \"kubernetes.io/projected/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e-kube-api-access-54j9x\") pod \"glance-c6dd-account-create-xw7kc\" (UID: \"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e\") " pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.481824 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.839751 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j4tpc" podUID="14f970e0-8d42-46d6-937a-c39f521f6bea" containerName="ovn-controller" probeResult="failure" output=< Oct 14 13:16:22 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 14 13:16:22 crc kubenswrapper[4837]: > Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.875834 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:16:22 crc kubenswrapper[4837]: I1014 13:16:22.888125 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cp8xg" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.049244 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c6dd-account-create-xw7kc"] Oct 14 13:16:23 crc kubenswrapper[4837]: W1014 13:16:23.059812 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d1fa01e_58a0_4877_b0c8_77e6f6de1f6e.slice/crio-89e4ac2ee089325cb6d4722d44db05fc74b3ec9f830ab43a34a8d5bccc7698e8 WatchSource:0}: Error finding container 89e4ac2ee089325cb6d4722d44db05fc74b3ec9f830ab43a34a8d5bccc7698e8: Status 404 returned error can't find the container with id 89e4ac2ee089325cb6d4722d44db05fc74b3ec9f830ab43a34a8d5bccc7698e8 Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.141406 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j4tpc-config-22c7g"] Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.142426 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.144940 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.156752 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j4tpc-config-22c7g"] Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.304570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-log-ovn\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.304643 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-additional-scripts\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.305838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.306042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-scripts\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: 
\"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.306146 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sc94\" (UniqueName: \"kubernetes.io/projected/dfa0b792-e2ef-479e-9e25-518d962509bb-kube-api-access-5sc94\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.306344 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run-ovn\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407092 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-scripts\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sc94\" (UniqueName: \"kubernetes.io/projected/dfa0b792-e2ef-479e-9e25-518d962509bb-kube-api-access-5sc94\") pod 
\"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407149 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run-ovn\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407203 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-log-ovn\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-additional-scripts\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407327 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407709 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run-ovn\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: 
\"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.407844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-log-ovn\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.408088 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-additional-scripts\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.409584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-scripts\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.427471 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sc94\" (UniqueName: \"kubernetes.io/projected/dfa0b792-e2ef-479e-9e25-518d962509bb-kube-api-access-5sc94\") pod \"ovn-controller-j4tpc-config-22c7g\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.458479 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.633041 4837 generic.go:334] "Generic (PLEG): container finished" podID="4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e" containerID="25aa3738b380d549592e12ec80b99ff42795198087c88184161d295bfe1a7e76" exitCode=0 Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.633215 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c6dd-account-create-xw7kc" event={"ID":"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e","Type":"ContainerDied","Data":"25aa3738b380d549592e12ec80b99ff42795198087c88184161d295bfe1a7e76"} Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.633573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c6dd-account-create-xw7kc" event={"ID":"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e","Type":"ContainerStarted","Data":"89e4ac2ee089325cb6d4722d44db05fc74b3ec9f830ab43a34a8d5bccc7698e8"} Oct 14 13:16:23 crc kubenswrapper[4837]: I1014 13:16:23.926763 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j4tpc-config-22c7g"] Oct 14 13:16:24 crc kubenswrapper[4837]: I1014 13:16:24.646978 4837 generic.go:334] "Generic (PLEG): container finished" podID="dfa0b792-e2ef-479e-9e25-518d962509bb" containerID="4ae2696762c4caa0ad09d2879e82fa26cff9e32993154d92464c425658467580" exitCode=0 Oct 14 13:16:24 crc kubenswrapper[4837]: I1014 13:16:24.647547 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j4tpc-config-22c7g" event={"ID":"dfa0b792-e2ef-479e-9e25-518d962509bb","Type":"ContainerDied","Data":"4ae2696762c4caa0ad09d2879e82fa26cff9e32993154d92464c425658467580"} Oct 14 13:16:24 crc kubenswrapper[4837]: I1014 13:16:24.647586 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j4tpc-config-22c7g" 
event={"ID":"dfa0b792-e2ef-479e-9e25-518d962509bb","Type":"ContainerStarted","Data":"db03750e1c0252d1fd455817b18734b286baa4148be493eb4efadfe9686f84ad"} Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.021687 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.136502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54j9x\" (UniqueName: \"kubernetes.io/projected/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e-kube-api-access-54j9x\") pod \"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e\" (UID: \"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e\") " Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.143340 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e-kube-api-access-54j9x" (OuterVolumeSpecName: "kube-api-access-54j9x") pod "4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e" (UID: "4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e"). InnerVolumeSpecName "kube-api-access-54j9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.238058 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54j9x\" (UniqueName: \"kubernetes.io/projected/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e-kube-api-access-54j9x\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.660710 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c6dd-account-create-xw7kc" Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.660702 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c6dd-account-create-xw7kc" event={"ID":"4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e","Type":"ContainerDied","Data":"89e4ac2ee089325cb6d4722d44db05fc74b3ec9f830ab43a34a8d5bccc7698e8"} Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.661260 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89e4ac2ee089325cb6d4722d44db05fc74b3ec9f830ab43a34a8d5bccc7698e8" Oct 14 13:16:25 crc kubenswrapper[4837]: I1014 13:16:25.965134 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.051883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-scripts\") pod \"dfa0b792-e2ef-479e-9e25-518d962509bb\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.051929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run-ovn\") pod \"dfa0b792-e2ef-479e-9e25-518d962509bb\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.051953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run\") pod \"dfa0b792-e2ef-479e-9e25-518d962509bb\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.051999 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-log-ovn\") pod \"dfa0b792-e2ef-479e-9e25-518d962509bb\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052049 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-additional-scripts\") pod \"dfa0b792-e2ef-479e-9e25-518d962509bb\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052063 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "dfa0b792-e2ef-479e-9e25-518d962509bb" (UID: "dfa0b792-e2ef-479e-9e25-518d962509bb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052080 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sc94\" (UniqueName: \"kubernetes.io/projected/dfa0b792-e2ef-479e-9e25-518d962509bb-kube-api-access-5sc94\") pod \"dfa0b792-e2ef-479e-9e25-518d962509bb\" (UID: \"dfa0b792-e2ef-479e-9e25-518d962509bb\") " Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052091 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run" (OuterVolumeSpecName: "var-run") pod "dfa0b792-e2ef-479e-9e25-518d962509bb" (UID: "dfa0b792-e2ef-479e-9e25-518d962509bb"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052116 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "dfa0b792-e2ef-479e-9e25-518d962509bb" (UID: "dfa0b792-e2ef-479e-9e25-518d962509bb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052690 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052716 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-run\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052725 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dfa0b792-e2ef-479e-9e25-518d962509bb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.052939 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "dfa0b792-e2ef-479e-9e25-518d962509bb" (UID: "dfa0b792-e2ef-479e-9e25-518d962509bb"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.053229 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-scripts" (OuterVolumeSpecName: "scripts") pod "dfa0b792-e2ef-479e-9e25-518d962509bb" (UID: "dfa0b792-e2ef-479e-9e25-518d962509bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.062624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa0b792-e2ef-479e-9e25-518d962509bb-kube-api-access-5sc94" (OuterVolumeSpecName: "kube-api-access-5sc94") pod "dfa0b792-e2ef-479e-9e25-518d962509bb" (UID: "dfa0b792-e2ef-479e-9e25-518d962509bb"). InnerVolumeSpecName "kube-api-access-5sc94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.153388 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.153439 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa0b792-e2ef-479e-9e25-518d962509bb-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.153456 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sc94\" (UniqueName: \"kubernetes.io/projected/dfa0b792-e2ef-479e-9e25-518d962509bb-kube-api-access-5sc94\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.670101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j4tpc-config-22c7g" 
event={"ID":"dfa0b792-e2ef-479e-9e25-518d962509bb","Type":"ContainerDied","Data":"db03750e1c0252d1fd455817b18734b286baa4148be493eb4efadfe9686f84ad"} Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.670151 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db03750e1c0252d1fd455817b18734b286baa4148be493eb4efadfe9686f84ad" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.670207 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j4tpc-config-22c7g" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.964484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:16:26 crc kubenswrapper[4837]: I1014 13:16:26.971204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d34918e7-1e17-4d1d-a163-4d2f0539f2d7-etc-swift\") pod \"swift-storage-0\" (UID: \"d34918e7-1e17-4d1d-a163-4d2f0539f2d7\") " pod="openstack/swift-storage-0" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.065668 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-j4tpc-config-22c7g"] Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.077834 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-j4tpc-config-22c7g"] Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.091434 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.347003 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-f7lcc"] Oct 14 13:16:27 crc kubenswrapper[4837]: E1014 13:16:27.347424 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa0b792-e2ef-479e-9e25-518d962509bb" containerName="ovn-config" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.347441 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa0b792-e2ef-479e-9e25-518d962509bb" containerName="ovn-config" Oct 14 13:16:27 crc kubenswrapper[4837]: E1014 13:16:27.347461 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e" containerName="mariadb-account-create" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.347469 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e" containerName="mariadb-account-create" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.347680 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa0b792-e2ef-479e-9e25-518d962509bb" containerName="ovn-config" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.347698 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e" containerName="mariadb-account-create" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.348360 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.353728 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h5zj9" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.354287 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.355076 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-f7lcc"] Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.373881 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-db-sync-config-data\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.374596 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-config-data\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.374665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg678\" (UniqueName: \"kubernetes.io/projected/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-kube-api-access-vg678\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.374740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-combined-ca-bundle\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.476309 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-config-data\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.476390 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg678\" (UniqueName: \"kubernetes.io/projected/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-kube-api-access-vg678\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.476428 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-combined-ca-bundle\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.476467 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-db-sync-config-data\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.481334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-db-sync-config-data\") pod \"glance-db-sync-f7lcc\" (UID: 
\"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.481457 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-config-data\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.483748 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-combined-ca-bundle\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.497134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg678\" (UniqueName: \"kubernetes.io/projected/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-kube-api-access-vg678\") pod \"glance-db-sync-f7lcc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.678276 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.686418 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 14 13:16:27 crc kubenswrapper[4837]: I1014 13:16:27.816488 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-j4tpc" Oct 14 13:16:28 crc kubenswrapper[4837]: W1014 13:16:28.039913 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23ad4da0_18fa_4eba_990c_4c9c80d4ecdc.slice/crio-b60c7c572ba69b254ae6df7203496ae42f7f50af272707912f7654cfd25817eb WatchSource:0}: Error finding container b60c7c572ba69b254ae6df7203496ae42f7f50af272707912f7654cfd25817eb: Status 404 returned error can't find the container with id b60c7c572ba69b254ae6df7203496ae42f7f50af272707912f7654cfd25817eb Oct 14 13:16:28 crc kubenswrapper[4837]: I1014 13:16:28.041781 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-f7lcc"] Oct 14 13:16:28 crc kubenswrapper[4837]: I1014 13:16:28.689322 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"5dab23548cb784f33f9903950fb64a3848bf5effd35110c30fdaeaebc4c2c960"} Oct 14 13:16:28 crc kubenswrapper[4837]: I1014 13:16:28.691593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f7lcc" event={"ID":"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc","Type":"ContainerStarted","Data":"b60c7c572ba69b254ae6df7203496ae42f7f50af272707912f7654cfd25817eb"} Oct 14 13:16:28 crc kubenswrapper[4837]: I1014 13:16:28.800265 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa0b792-e2ef-479e-9e25-518d962509bb" path="/var/lib/kubelet/pods/dfa0b792-e2ef-479e-9e25-518d962509bb/volumes" Oct 14 13:16:29 crc kubenswrapper[4837]: I1014 
13:16:29.013809 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 14 13:16:29 crc kubenswrapper[4837]: I1014 13:16:29.702114 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"4a87e14543a46e90c1da230960b633fde05e9f01903f0b84faed9772a9e0b213"} Oct 14 13:16:29 crc kubenswrapper[4837]: I1014 13:16:29.702185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"83fc2ef580b8a4ae21a24d418a39aee06b4900efab401025afd5c647794badb0"} Oct 14 13:16:30 crc kubenswrapper[4837]: I1014 13:16:30.718848 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"6466ca8c298a2f7d85c04f4fc6edd97afa9892a34cfe722f14081fd03d0e9741"} Oct 14 13:16:30 crc kubenswrapper[4837]: I1014 13:16:30.719188 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"d814e88593d0bcefe85d536035885bfcc4be2c5682797a3f3bb491ea5898e304"} Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.680553 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-96ae-account-create-qxl8c"] Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.681500 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.686695 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.696102 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-96ae-account-create-qxl8c"] Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.795683 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/33eb2486-60b1-4fc5-aeeb-1a1855693079-kube-api-access-2lzxt\") pod \"keystone-96ae-account-create-qxl8c\" (UID: \"33eb2486-60b1-4fc5-aeeb-1a1855693079\") " pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.866284 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d256-account-create-q8d4t"] Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.867613 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.869345 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.872344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d256-account-create-q8d4t"] Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.897595 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/33eb2486-60b1-4fc5-aeeb-1a1855693079-kube-api-access-2lzxt\") pod \"keystone-96ae-account-create-qxl8c\" (UID: \"33eb2486-60b1-4fc5-aeeb-1a1855693079\") " pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.897877 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqlz\" (UniqueName: \"kubernetes.io/projected/9a07282f-11f0-40b7-af8f-32fd266b70de-kube-api-access-5sqlz\") pod \"placement-d256-account-create-q8d4t\" (UID: \"9a07282f-11f0-40b7-af8f-32fd266b70de\") " pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.929192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/33eb2486-60b1-4fc5-aeeb-1a1855693079-kube-api-access-2lzxt\") pod \"keystone-96ae-account-create-qxl8c\" (UID: \"33eb2486-60b1-4fc5-aeeb-1a1855693079\") " pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:31 crc kubenswrapper[4837]: I1014 13:16:31.999865 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqlz\" (UniqueName: \"kubernetes.io/projected/9a07282f-11f0-40b7-af8f-32fd266b70de-kube-api-access-5sqlz\") pod \"placement-d256-account-create-q8d4t\" (UID: 
\"9a07282f-11f0-40b7-af8f-32fd266b70de\") " pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:32 crc kubenswrapper[4837]: I1014 13:16:32.017799 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:32 crc kubenswrapper[4837]: I1014 13:16:32.018005 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqlz\" (UniqueName: \"kubernetes.io/projected/9a07282f-11f0-40b7-af8f-32fd266b70de-kube-api-access-5sqlz\") pod \"placement-d256-account-create-q8d4t\" (UID: \"9a07282f-11f0-40b7-af8f-32fd266b70de\") " pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:32 crc kubenswrapper[4837]: I1014 13:16:32.193207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:33 crc kubenswrapper[4837]: I1014 13:16:33.422170 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-96ae-account-create-qxl8c"] Oct 14 13:16:33 crc kubenswrapper[4837]: I1014 13:16:33.469297 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d256-account-create-q8d4t"] Oct 14 13:16:33 crc kubenswrapper[4837]: W1014 13:16:33.612062 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33eb2486_60b1_4fc5_aeeb_1a1855693079.slice/crio-90b6e4b6a3010c831f6c16dfc008d0f2a391737d61946664632a35120a9f86fa WatchSource:0}: Error finding container 90b6e4b6a3010c831f6c16dfc008d0f2a391737d61946664632a35120a9f86fa: Status 404 returned error can't find the container with id 90b6e4b6a3010c831f6c16dfc008d0f2a391737d61946664632a35120a9f86fa Oct 14 13:16:33 crc kubenswrapper[4837]: W1014 13:16:33.615134 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a07282f_11f0_40b7_af8f_32fd266b70de.slice/crio-c37c9e19130e4c9b6026091228a7af7123a61e80171c9bd96606e78734766288 WatchSource:0}: Error finding container c37c9e19130e4c9b6026091228a7af7123a61e80171c9bd96606e78734766288: Status 404 returned error can't find the container with id c37c9e19130e4c9b6026091228a7af7123a61e80171c9bd96606e78734766288 Oct 14 13:16:33 crc kubenswrapper[4837]: I1014 13:16:33.756317 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-96ae-account-create-qxl8c" event={"ID":"33eb2486-60b1-4fc5-aeeb-1a1855693079","Type":"ContainerStarted","Data":"90b6e4b6a3010c831f6c16dfc008d0f2a391737d61946664632a35120a9f86fa"} Oct 14 13:16:33 crc kubenswrapper[4837]: I1014 13:16:33.757828 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d256-account-create-q8d4t" event={"ID":"9a07282f-11f0-40b7-af8f-32fd266b70de","Type":"ContainerStarted","Data":"c37c9e19130e4c9b6026091228a7af7123a61e80171c9bd96606e78734766288"} Oct 14 13:16:38 crc kubenswrapper[4837]: I1014 13:16:38.946461 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:16:39 crc kubenswrapper[4837]: I1014 13:16:39.015281 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.730725 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qkhg2"] Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.731996 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.749268 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qkhg2"] Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.827285 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7b2p6"] Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.828391 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.837103 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7b2p6"] Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.846453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d256-account-create-q8d4t" event={"ID":"9a07282f-11f0-40b7-af8f-32fd266b70de","Type":"ContainerStarted","Data":"32f2264f100ce8e8620a783ddc1527cd23f2cb4c2f6ba2f0f5880c121cc16202"} Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.847431 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6nvk\" (UniqueName: \"kubernetes.io/projected/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc-kube-api-access-r6nvk\") pod \"cinder-db-create-qkhg2\" (UID: \"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc\") " pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.949054 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6nvk\" (UniqueName: \"kubernetes.io/projected/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc-kube-api-access-r6nvk\") pod \"cinder-db-create-qkhg2\" (UID: \"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc\") " pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.949151 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rs6vs\" (UniqueName: \"kubernetes.io/projected/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c-kube-api-access-rs6vs\") pod \"barbican-db-create-7b2p6\" (UID: \"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c\") " pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:40 crc kubenswrapper[4837]: I1014 13:16:40.965795 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6nvk\" (UniqueName: \"kubernetes.io/projected/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc-kube-api-access-r6nvk\") pod \"cinder-db-create-qkhg2\" (UID: \"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc\") " pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.023278 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5n9j7"] Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.028476 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.031262 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5n9j7"] Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.053323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6vs\" (UniqueName: \"kubernetes.io/projected/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c-kube-api-access-rs6vs\") pod \"barbican-db-create-7b2p6\" (UID: \"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c\") " pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.055576 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.072817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6vs\" (UniqueName: \"kubernetes.io/projected/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c-kube-api-access-rs6vs\") pod \"barbican-db-create-7b2p6\" (UID: \"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c\") " pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.140414 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.140487 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.140532 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.142299 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f7061072f040d06169aa6c27c24b779e700a974c29cdb9d45439f3b10ea132d"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.142749 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://2f7061072f040d06169aa6c27c24b779e700a974c29cdb9d45439f3b10ea132d" gracePeriod=600 Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.155143 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55v6g\" (UniqueName: \"kubernetes.io/projected/266f6dd3-65a1-49e5-a904-66ed929e8718-kube-api-access-55v6g\") pod \"neutron-db-create-5n9j7\" (UID: \"266f6dd3-65a1-49e5-a904-66ed929e8718\") " pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.160732 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.258796 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55v6g\" (UniqueName: \"kubernetes.io/projected/266f6dd3-65a1-49e5-a904-66ed929e8718-kube-api-access-55v6g\") pod \"neutron-db-create-5n9j7\" (UID: \"266f6dd3-65a1-49e5-a904-66ed929e8718\") " pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.289115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55v6g\" (UniqueName: \"kubernetes.io/projected/266f6dd3-65a1-49e5-a904-66ed929e8718-kube-api-access-55v6g\") pod \"neutron-db-create-5n9j7\" (UID: \"266f6dd3-65a1-49e5-a904-66ed929e8718\") " pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.347599 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.360796 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qkhg2"] Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.678947 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7b2p6"] Oct 14 13:16:41 crc kubenswrapper[4837]: W1014 13:16:41.686031 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff0b9396_170a_4cdd_b5d9_dbddfdb17f2c.slice/crio-476c778ff760f88a84d49d73dfb6a6d884bf61802902c3ebda8ff8178d2b36ba WatchSource:0}: Error finding container 476c778ff760f88a84d49d73dfb6a6d884bf61802902c3ebda8ff8178d2b36ba: Status 404 returned error can't find the container with id 476c778ff760f88a84d49d73dfb6a6d884bf61802902c3ebda8ff8178d2b36ba Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.829548 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5n9j7"] Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.855997 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="2f7061072f040d06169aa6c27c24b779e700a974c29cdb9d45439f3b10ea132d" exitCode=0 Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.856055 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"2f7061072f040d06169aa6c27c24b779e700a974c29cdb9d45439f3b10ea132d"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.856081 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.856096 4837 scope.go:117] "RemoveContainer" containerID="a9c5da248ef4f304e8c83104496af5297a77f5eb3df38f2188353642fbfdb087" Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.869576 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qkhg2" event={"ID":"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc","Type":"ContainerStarted","Data":"b9aa9ab4b24b292f43bb72b34a88b943c063f3802bfe41f80fec488f0da9134f"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.869782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qkhg2" event={"ID":"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc","Type":"ContainerStarted","Data":"53565c5d30defcc5c82a64d450d377b9b51a622f013d433e64d636753e3c8009"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.871114 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7b2p6" event={"ID":"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c","Type":"ContainerStarted","Data":"476c778ff760f88a84d49d73dfb6a6d884bf61802902c3ebda8ff8178d2b36ba"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.892312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"73d534d5d27fa5f55318ab56c797fd1854e147df9a35bfef828f57c25149a268"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.892362 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"7cddb790ba259aca70866656482484ac69d581860e8b028a3fe621fe3c9292db"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.892376 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"8500c449c169744905970edeeb167b22f986c822d7a4330651215213faf7ab73"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.892386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"7ef3646d580b3578a0c537fe0df33772d11a38f2a5308ffbfc40177a0c536e07"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.895347 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5n9j7" event={"ID":"266f6dd3-65a1-49e5-a904-66ed929e8718","Type":"ContainerStarted","Data":"00b0b33b43cd4818409e230e593caddb6fbc754005fbea2831da67f165d97784"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.901518 4837 generic.go:334] "Generic (PLEG): container finished" podID="33eb2486-60b1-4fc5-aeeb-1a1855693079" containerID="6c5948fe3c127d0fb3be578025f4b48e00bf4e592a793ea27559812b57de7b38" exitCode=0 Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.901623 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-96ae-account-create-qxl8c" event={"ID":"33eb2486-60b1-4fc5-aeeb-1a1855693079","Type":"ContainerDied","Data":"6c5948fe3c127d0fb3be578025f4b48e00bf4e592a793ea27559812b57de7b38"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.904856 4837 generic.go:334] "Generic (PLEG): container finished" podID="9a07282f-11f0-40b7-af8f-32fd266b70de" containerID="32f2264f100ce8e8620a783ddc1527cd23f2cb4c2f6ba2f0f5880c121cc16202" exitCode=0 Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.904931 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d256-account-create-q8d4t" event={"ID":"9a07282f-11f0-40b7-af8f-32fd266b70de","Type":"ContainerDied","Data":"32f2264f100ce8e8620a783ddc1527cd23f2cb4c2f6ba2f0f5880c121cc16202"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 
13:16:41.907702 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f7lcc" event={"ID":"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc","Type":"ContainerStarted","Data":"e4a55f3394f1cb547aa6a393bc2b03463e445b7d3398a6c16bad0f18eb331be1"} Oct 14 13:16:41 crc kubenswrapper[4837]: I1014 13:16:41.951009 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-f7lcc" podStartSLOduration=2.473143387 podStartE2EDuration="14.950993814s" podCreationTimestamp="2025-10-14 13:16:27 +0000 UTC" firstStartedPulling="2025-10-14 13:16:28.042300579 +0000 UTC m=+925.959300392" lastFinishedPulling="2025-10-14 13:16:40.520151006 +0000 UTC m=+938.437150819" observedRunningTime="2025-10-14 13:16:41.949654187 +0000 UTC m=+939.866654010" watchObservedRunningTime="2025-10-14 13:16:41.950993814 +0000 UTC m=+939.867993627" Oct 14 13:16:42 crc kubenswrapper[4837]: I1014 13:16:42.922446 4837 generic.go:334] "Generic (PLEG): container finished" podID="ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c" containerID="ab70080b6331a8ef61c921a1cd4a01149fda6956950a609e5b8d0f75db9cbd56" exitCode=0 Oct 14 13:16:42 crc kubenswrapper[4837]: I1014 13:16:42.923037 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7b2p6" event={"ID":"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c","Type":"ContainerDied","Data":"ab70080b6331a8ef61c921a1cd4a01149fda6956950a609e5b8d0f75db9cbd56"} Oct 14 13:16:42 crc kubenswrapper[4837]: I1014 13:16:42.927316 4837 generic.go:334] "Generic (PLEG): container finished" podID="266f6dd3-65a1-49e5-a904-66ed929e8718" containerID="999aab289e810a4bf2e8ae2d0732217e105c3af8a373dd907b66adea15b451f1" exitCode=0 Oct 14 13:16:42 crc kubenswrapper[4837]: I1014 13:16:42.927462 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5n9j7" 
event={"ID":"266f6dd3-65a1-49e5-a904-66ed929e8718","Type":"ContainerDied","Data":"999aab289e810a4bf2e8ae2d0732217e105c3af8a373dd907b66adea15b451f1"} Oct 14 13:16:42 crc kubenswrapper[4837]: I1014 13:16:42.938378 4837 generic.go:334] "Generic (PLEG): container finished" podID="c5fd4c34-4f7d-493b-b792-d3e20b82d5cc" containerID="b9aa9ab4b24b292f43bb72b34a88b943c063f3802bfe41f80fec488f0da9134f" exitCode=0 Oct 14 13:16:42 crc kubenswrapper[4837]: I1014 13:16:42.938583 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qkhg2" event={"ID":"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc","Type":"ContainerDied","Data":"b9aa9ab4b24b292f43bb72b34a88b943c063f3802bfe41f80fec488f0da9134f"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.369371 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.375107 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.391140 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.504991 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6nvk\" (UniqueName: \"kubernetes.io/projected/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc-kube-api-access-r6nvk\") pod \"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc\" (UID: \"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc\") " Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.505113 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/33eb2486-60b1-4fc5-aeeb-1a1855693079-kube-api-access-2lzxt\") pod \"33eb2486-60b1-4fc5-aeeb-1a1855693079\" (UID: \"33eb2486-60b1-4fc5-aeeb-1a1855693079\") " Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.505223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sqlz\" (UniqueName: \"kubernetes.io/projected/9a07282f-11f0-40b7-af8f-32fd266b70de-kube-api-access-5sqlz\") pod \"9a07282f-11f0-40b7-af8f-32fd266b70de\" (UID: \"9a07282f-11f0-40b7-af8f-32fd266b70de\") " Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.510657 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc-kube-api-access-r6nvk" (OuterVolumeSpecName: "kube-api-access-r6nvk") pod "c5fd4c34-4f7d-493b-b792-d3e20b82d5cc" (UID: "c5fd4c34-4f7d-493b-b792-d3e20b82d5cc"). InnerVolumeSpecName "kube-api-access-r6nvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.510732 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33eb2486-60b1-4fc5-aeeb-1a1855693079-kube-api-access-2lzxt" (OuterVolumeSpecName: "kube-api-access-2lzxt") pod "33eb2486-60b1-4fc5-aeeb-1a1855693079" (UID: "33eb2486-60b1-4fc5-aeeb-1a1855693079"). InnerVolumeSpecName "kube-api-access-2lzxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.512777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a07282f-11f0-40b7-af8f-32fd266b70de-kube-api-access-5sqlz" (OuterVolumeSpecName: "kube-api-access-5sqlz") pod "9a07282f-11f0-40b7-af8f-32fd266b70de" (UID: "9a07282f-11f0-40b7-af8f-32fd266b70de"). InnerVolumeSpecName "kube-api-access-5sqlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.607539 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/33eb2486-60b1-4fc5-aeeb-1a1855693079-kube-api-access-2lzxt\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.607580 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sqlz\" (UniqueName: \"kubernetes.io/projected/9a07282f-11f0-40b7-af8f-32fd266b70de-kube-api-access-5sqlz\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.607590 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6nvk\" (UniqueName: \"kubernetes.io/projected/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc-kube-api-access-r6nvk\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.954672 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"813bd83338494c522faae43f1e0402e8a901e33aa01d4b271ae92aad2c815a69"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.955237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"5eb0fcb44b07f5244c0d9d2779087953242c28489c6f40f225ca219a8858a4aa"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.955252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"3a3c0ec2030a8fe1b332e11370fb8ca4d41f89c20a0ac716a4d4a7d74264ec0a"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.957231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-96ae-account-create-qxl8c" event={"ID":"33eb2486-60b1-4fc5-aeeb-1a1855693079","Type":"ContainerDied","Data":"90b6e4b6a3010c831f6c16dfc008d0f2a391737d61946664632a35120a9f86fa"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.957272 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-96ae-account-create-qxl8c" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.957298 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b6e4b6a3010c831f6c16dfc008d0f2a391737d61946664632a35120a9f86fa" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.958644 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d256-account-create-q8d4t" event={"ID":"9a07282f-11f0-40b7-af8f-32fd266b70de","Type":"ContainerDied","Data":"c37c9e19130e4c9b6026091228a7af7123a61e80171c9bd96606e78734766288"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.958695 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c37c9e19130e4c9b6026091228a7af7123a61e80171c9bd96606e78734766288" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.958665 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d256-account-create-q8d4t" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.960830 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qkhg2" event={"ID":"c5fd4c34-4f7d-493b-b792-d3e20b82d5cc","Type":"ContainerDied","Data":"53565c5d30defcc5c82a64d450d377b9b51a622f013d433e64d636753e3c8009"} Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.960862 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53565c5d30defcc5c82a64d450d377b9b51a622f013d433e64d636753e3c8009" Oct 14 13:16:43 crc kubenswrapper[4837]: I1014 13:16:43.960978 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qkhg2" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.203103 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.268730 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.319204 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs6vs\" (UniqueName: \"kubernetes.io/projected/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c-kube-api-access-rs6vs\") pod \"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c\" (UID: \"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c\") " Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.326447 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c-kube-api-access-rs6vs" (OuterVolumeSpecName: "kube-api-access-rs6vs") pod "ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c" (UID: "ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c"). InnerVolumeSpecName "kube-api-access-rs6vs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.421510 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55v6g\" (UniqueName: \"kubernetes.io/projected/266f6dd3-65a1-49e5-a904-66ed929e8718-kube-api-access-55v6g\") pod \"266f6dd3-65a1-49e5-a904-66ed929e8718\" (UID: \"266f6dd3-65a1-49e5-a904-66ed929e8718\") " Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.422016 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs6vs\" (UniqueName: \"kubernetes.io/projected/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c-kube-api-access-rs6vs\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.428304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/266f6dd3-65a1-49e5-a904-66ed929e8718-kube-api-access-55v6g" (OuterVolumeSpecName: "kube-api-access-55v6g") pod "266f6dd3-65a1-49e5-a904-66ed929e8718" (UID: "266f6dd3-65a1-49e5-a904-66ed929e8718"). InnerVolumeSpecName "kube-api-access-55v6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.523496 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55v6g\" (UniqueName: \"kubernetes.io/projected/266f6dd3-65a1-49e5-a904-66ed929e8718-kube-api-access-55v6g\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.988888 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5n9j7" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.988904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5n9j7" event={"ID":"266f6dd3-65a1-49e5-a904-66ed929e8718","Type":"ContainerDied","Data":"00b0b33b43cd4818409e230e593caddb6fbc754005fbea2831da67f165d97784"} Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.989328 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b0b33b43cd4818409e230e593caddb6fbc754005fbea2831da67f165d97784" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.990403 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7b2p6" event={"ID":"ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c","Type":"ContainerDied","Data":"476c778ff760f88a84d49d73dfb6a6d884bf61802902c3ebda8ff8178d2b36ba"} Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.990427 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476c778ff760f88a84d49d73dfb6a6d884bf61802902c3ebda8ff8178d2b36ba" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.990475 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7b2p6" Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.997784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"4c5c112b875f4f0331197b7f8559802a87ee1172fdae0cbfa347ab74a74a627d"} Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.997827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"11092f8a685b3d94194c4b0e9a6fcc7e8576fd90afe3b84edc8da9f008522f32"} Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.997840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"0ec547036858056227ffd846463ef8a341f0fa2b3ef0739e75aee92855521020"} Oct 14 13:16:44 crc kubenswrapper[4837]: I1014 13:16:44.997907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d34918e7-1e17-4d1d-a163-4d2f0539f2d7","Type":"ContainerStarted","Data":"80429c2cfd260fa6534754576ad9cd9d7be79eaf94728ad8b081076978f6aaef"} Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.059754 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.760549855 podStartE2EDuration="52.059734125s" podCreationTimestamp="2025-10-14 13:15:53 +0000 UTC" firstStartedPulling="2025-10-14 13:16:27.697314187 +0000 UTC m=+925.614314000" lastFinishedPulling="2025-10-14 13:16:42.996498457 +0000 UTC m=+940.913498270" observedRunningTime="2025-10-14 13:16:45.054415279 +0000 UTC m=+942.971415142" watchObservedRunningTime="2025-10-14 13:16:45.059734125 +0000 UTC m=+942.976733948" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.314848 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6d5b6d6b67-7dgrl"] Oct 14 13:16:45 crc kubenswrapper[4837]: E1014 13:16:45.315255 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266f6dd3-65a1-49e5-a904-66ed929e8718" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315279 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="266f6dd3-65a1-49e5-a904-66ed929e8718" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: E1014 13:16:45.315304 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fd4c34-4f7d-493b-b792-d3e20b82d5cc" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315314 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fd4c34-4f7d-493b-b792-d3e20b82d5cc" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: E1014 13:16:45.315340 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315349 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: E1014 13:16:45.315364 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33eb2486-60b1-4fc5-aeeb-1a1855693079" containerName="mariadb-account-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33eb2486-60b1-4fc5-aeeb-1a1855693079" containerName="mariadb-account-create" Oct 14 13:16:45 crc kubenswrapper[4837]: E1014 13:16:45.315413 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a07282f-11f0-40b7-af8f-32fd266b70de" containerName="mariadb-account-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315423 4837 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9a07282f-11f0-40b7-af8f-32fd266b70de" containerName="mariadb-account-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315617 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fd4c34-4f7d-493b-b792-d3e20b82d5cc" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315649 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="33eb2486-60b1-4fc5-aeeb-1a1855693079" containerName="mariadb-account-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315666 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315678 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="266f6dd3-65a1-49e5-a904-66ed929e8718" containerName="mariadb-database-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.315691 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a07282f-11f0-40b7-af8f-32fd266b70de" containerName="mariadb-account-create" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.316580 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.318484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.379137 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-7dgrl"] Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.438754 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-config\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.438830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.438994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.439264 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c87r\" (UniqueName: \"kubernetes.io/projected/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-kube-api-access-2c87r\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.439399 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.439445 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.540558 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c87r\" (UniqueName: \"kubernetes.io/projected/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-kube-api-access-2c87r\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.540621 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.540647 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.540731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-config\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.540783 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.540826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.541655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.541853 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: 
I1014 13:16:45.541945 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.541973 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.542745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-config\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.562334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c87r\" (UniqueName: \"kubernetes.io/projected/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-kube-api-access-2c87r\") pod \"dnsmasq-dns-6d5b6d6b67-7dgrl\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:45 crc kubenswrapper[4837]: I1014 13:16:45.632453 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:46 crc kubenswrapper[4837]: I1014 13:16:46.085835 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-7dgrl"] Oct 14 13:16:46 crc kubenswrapper[4837]: W1014 13:16:46.097687 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c650b7f_2dc5_4dde_82b2_1765fb5bf2f4.slice/crio-bc24c042c0fba3179d467228147fb266ec2d4a72a7a3e34d66005d1797ccc4c8 WatchSource:0}: Error finding container bc24c042c0fba3179d467228147fb266ec2d4a72a7a3e34d66005d1797ccc4c8: Status 404 returned error can't find the container with id bc24c042c0fba3179d467228147fb266ec2d4a72a7a3e34d66005d1797ccc4c8 Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.015688 4837 generic.go:334] "Generic (PLEG): container finished" podID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerID="4077b23ba30ca10ac66ee37b13c21428591f6fb4dd2eef7698b93fee56310245" exitCode=0 Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.015751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" event={"ID":"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4","Type":"ContainerDied","Data":"4077b23ba30ca10ac66ee37b13c21428591f6fb4dd2eef7698b93fee56310245"} Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.016213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" event={"ID":"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4","Type":"ContainerStarted","Data":"bc24c042c0fba3179d467228147fb266ec2d4a72a7a3e34d66005d1797ccc4c8"} Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.378003 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wlzpt"] Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.380367 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.383061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.383512 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.383602 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6z55" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.383964 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.387708 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wlzpt"] Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.472669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27htr\" (UniqueName: \"kubernetes.io/projected/6b72d2be-c361-496e-9ea4-990b2e8f15a3-kube-api-access-27htr\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.472814 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-combined-ca-bundle\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.472897 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-config-data\") pod \"keystone-db-sync-wlzpt\" (UID: 
\"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.573949 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-combined-ca-bundle\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.574045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-config-data\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.574069 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27htr\" (UniqueName: \"kubernetes.io/projected/6b72d2be-c361-496e-9ea4-990b2e8f15a3-kube-api-access-27htr\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.580984 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-combined-ca-bundle\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.591969 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-config-data\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: 
I1014 13:16:47.612902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27htr\" (UniqueName: \"kubernetes.io/projected/6b72d2be-c361-496e-9ea4-990b2e8f15a3-kube-api-access-27htr\") pod \"keystone-db-sync-wlzpt\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:47 crc kubenswrapper[4837]: I1014 13:16:47.701693 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:16:48 crc kubenswrapper[4837]: I1014 13:16:48.026454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" event={"ID":"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4","Type":"ContainerStarted","Data":"91955c4a91227dbea9570a993160a537c157f36589ea153fd738e9fd0d017547"} Oct 14 13:16:48 crc kubenswrapper[4837]: I1014 13:16:48.026882 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:48 crc kubenswrapper[4837]: I1014 13:16:48.051331 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" podStartSLOduration=3.051309813 podStartE2EDuration="3.051309813s" podCreationTimestamp="2025-10-14 13:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:16:48.044858797 +0000 UTC m=+945.961858630" watchObservedRunningTime="2025-10-14 13:16:48.051309813 +0000 UTC m=+945.968309626" Oct 14 13:16:48 crc kubenswrapper[4837]: I1014 13:16:48.165897 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wlzpt"] Oct 14 13:16:48 crc kubenswrapper[4837]: W1014 13:16:48.168094 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b72d2be_c361_496e_9ea4_990b2e8f15a3.slice/crio-632fedaf8f4c786fd778553e74c89898ae28b8ea21781d9827f3dbe8808e63bf WatchSource:0}: Error finding container 632fedaf8f4c786fd778553e74c89898ae28b8ea21781d9827f3dbe8808e63bf: Status 404 returned error can't find the container with id 632fedaf8f4c786fd778553e74c89898ae28b8ea21781d9827f3dbe8808e63bf Oct 14 13:16:49 crc kubenswrapper[4837]: I1014 13:16:49.039728 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlzpt" event={"ID":"6b72d2be-c361-496e-9ea4-990b2e8f15a3","Type":"ContainerStarted","Data":"632fedaf8f4c786fd778553e74c89898ae28b8ea21781d9827f3dbe8808e63bf"} Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.048334 4837 generic.go:334] "Generic (PLEG): container finished" podID="23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" containerID="e4a55f3394f1cb547aa6a393bc2b03463e445b7d3398a6c16bad0f18eb331be1" exitCode=0 Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.048376 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f7lcc" event={"ID":"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc","Type":"ContainerDied","Data":"e4a55f3394f1cb547aa6a393bc2b03463e445b7d3398a6c16bad0f18eb331be1"} Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.849829 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-398e-account-create-2f6jz"] Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.851242 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.858244 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-398e-account-create-2f6jz"] Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.858508 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 14 13:16:50 crc kubenswrapper[4837]: I1014 13:16:50.937732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtgln\" (UniqueName: \"kubernetes.io/projected/55a9e520-b0fd-4651-9606-cbd533bea731-kube-api-access-wtgln\") pod \"cinder-398e-account-create-2f6jz\" (UID: \"55a9e520-b0fd-4651-9606-cbd533bea731\") " pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.049107 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtgln\" (UniqueName: \"kubernetes.io/projected/55a9e520-b0fd-4651-9606-cbd533bea731-kube-api-access-wtgln\") pod \"cinder-398e-account-create-2f6jz\" (UID: \"55a9e520-b0fd-4651-9606-cbd533bea731\") " pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.057274 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-eb27-account-create-mn6ht"] Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.058765 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.065733 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.069052 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-eb27-account-create-mn6ht"] Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.085862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtgln\" (UniqueName: \"kubernetes.io/projected/55a9e520-b0fd-4651-9606-cbd533bea731-kube-api-access-wtgln\") pod \"cinder-398e-account-create-2f6jz\" (UID: \"55a9e520-b0fd-4651-9606-cbd533bea731\") " pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.147568 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9d23-account-create-kjvbn"] Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.149022 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.152501 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5lb5\" (UniqueName: \"kubernetes.io/projected/e46f2d2f-de1a-48e3-b6dd-82e2780ac592-kube-api-access-k5lb5\") pod \"barbican-eb27-account-create-mn6ht\" (UID: \"e46f2d2f-de1a-48e3-b6dd-82e2780ac592\") " pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.152820 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.158436 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d23-account-create-kjvbn"] Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.177455 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.254443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5lb5\" (UniqueName: \"kubernetes.io/projected/e46f2d2f-de1a-48e3-b6dd-82e2780ac592-kube-api-access-k5lb5\") pod \"barbican-eb27-account-create-mn6ht\" (UID: \"e46f2d2f-de1a-48e3-b6dd-82e2780ac592\") " pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.254523 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkxlx\" (UniqueName: \"kubernetes.io/projected/88ae92d9-bc5d-46a1-b9fb-68b3482dca91-kube-api-access-jkxlx\") pod \"neutron-9d23-account-create-kjvbn\" (UID: \"88ae92d9-bc5d-46a1-b9fb-68b3482dca91\") " pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.297539 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-k5lb5\" (UniqueName: \"kubernetes.io/projected/e46f2d2f-de1a-48e3-b6dd-82e2780ac592-kube-api-access-k5lb5\") pod \"barbican-eb27-account-create-mn6ht\" (UID: \"e46f2d2f-de1a-48e3-b6dd-82e2780ac592\") " pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.356749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkxlx\" (UniqueName: \"kubernetes.io/projected/88ae92d9-bc5d-46a1-b9fb-68b3482dca91-kube-api-access-jkxlx\") pod \"neutron-9d23-account-create-kjvbn\" (UID: \"88ae92d9-bc5d-46a1-b9fb-68b3482dca91\") " pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.374923 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkxlx\" (UniqueName: \"kubernetes.io/projected/88ae92d9-bc5d-46a1-b9fb-68b3482dca91-kube-api-access-jkxlx\") pod \"neutron-9d23-account-create-kjvbn\" (UID: \"88ae92d9-bc5d-46a1-b9fb-68b3482dca91\") " pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.394045 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:51 crc kubenswrapper[4837]: I1014 13:16:51.474590 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.402575 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.473470 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-db-sync-config-data\") pod \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.473798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-config-data\") pod \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.473853 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-combined-ca-bundle\") pod \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.473893 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg678\" (UniqueName: \"kubernetes.io/projected/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-kube-api-access-vg678\") pod \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\" (UID: \"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc\") " Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.481640 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" (UID: "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.485007 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-kube-api-access-vg678" (OuterVolumeSpecName: "kube-api-access-vg678") pod "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" (UID: "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc"). InnerVolumeSpecName "kube-api-access-vg678". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.512239 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" (UID: "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.519763 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-config-data" (OuterVolumeSpecName: "config-data") pod "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" (UID: "23ad4da0-18fa-4eba-990c-4c9c80d4ecdc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.576408 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.576465 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg678\" (UniqueName: \"kubernetes.io/projected/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-kube-api-access-vg678\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.576481 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.576493 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.634262 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d23-account-create-kjvbn"] Oct 14 13:16:52 crc kubenswrapper[4837]: W1014 13:16:52.637555 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ae92d9_bc5d_46a1_b9fb_68b3482dca91.slice/crio-366ff4a5389f4fe134961a846745261e2c112630b56451a75a2d1740f48d9cdf WatchSource:0}: Error finding container 366ff4a5389f4fe134961a846745261e2c112630b56451a75a2d1740f48d9cdf: Status 404 returned error can't find the container with id 366ff4a5389f4fe134961a846745261e2c112630b56451a75a2d1740f48d9cdf Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.737414 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-eb27-account-create-mn6ht"] Oct 14 13:16:52 crc kubenswrapper[4837]: I1014 13:16:52.744221 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-398e-account-create-2f6jz"] Oct 14 13:16:52 crc kubenswrapper[4837]: W1014 13:16:52.744457 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode46f2d2f_de1a_48e3_b6dd_82e2780ac592.slice/crio-dc9cf5a6b6d5b1734929998caf3e68ad47d110949288c2acc4fc5f48c92281d2 WatchSource:0}: Error finding container dc9cf5a6b6d5b1734929998caf3e68ad47d110949288c2acc4fc5f48c92281d2: Status 404 returned error can't find the container with id dc9cf5a6b6d5b1734929998caf3e68ad47d110949288c2acc4fc5f48c92281d2 Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.087568 4837 generic.go:334] "Generic (PLEG): container finished" podID="e46f2d2f-de1a-48e3-b6dd-82e2780ac592" containerID="710c9649811fecf9d5f228584d4381aa08581192d962521ec584b60547009d3a" exitCode=0 Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.087696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-eb27-account-create-mn6ht" event={"ID":"e46f2d2f-de1a-48e3-b6dd-82e2780ac592","Type":"ContainerDied","Data":"710c9649811fecf9d5f228584d4381aa08581192d962521ec584b60547009d3a"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.087908 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-eb27-account-create-mn6ht" event={"ID":"e46f2d2f-de1a-48e3-b6dd-82e2780ac592","Type":"ContainerStarted","Data":"dc9cf5a6b6d5b1734929998caf3e68ad47d110949288c2acc4fc5f48c92281d2"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.089944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f7lcc" event={"ID":"23ad4da0-18fa-4eba-990c-4c9c80d4ecdc","Type":"ContainerDied","Data":"b60c7c572ba69b254ae6df7203496ae42f7f50af272707912f7654cfd25817eb"} Oct 14 13:16:53 crc 
kubenswrapper[4837]: I1014 13:16:53.090088 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b60c7c572ba69b254ae6df7203496ae42f7f50af272707912f7654cfd25817eb" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.090247 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-f7lcc" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.094238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlzpt" event={"ID":"6b72d2be-c361-496e-9ea4-990b2e8f15a3","Type":"ContainerStarted","Data":"38ba2d4649e5a96fe11d8a1cf49a7469c6fb3ed7e29fa35d236852eacac935c0"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.098079 4837 generic.go:334] "Generic (PLEG): container finished" podID="55a9e520-b0fd-4651-9606-cbd533bea731" containerID="76b2d7a45c476911b83b1f90b813862ff9bf439f3b2c9fcf10f96f676a19071f" exitCode=0 Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.098278 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-398e-account-create-2f6jz" event={"ID":"55a9e520-b0fd-4651-9606-cbd533bea731","Type":"ContainerDied","Data":"76b2d7a45c476911b83b1f90b813862ff9bf439f3b2c9fcf10f96f676a19071f"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.098316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-398e-account-create-2f6jz" event={"ID":"55a9e520-b0fd-4651-9606-cbd533bea731","Type":"ContainerStarted","Data":"be842895fb43be2419ec5f14c5429a7fbb6ec7467a4363e0978d3846adeff79a"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.100376 4837 generic.go:334] "Generic (PLEG): container finished" podID="88ae92d9-bc5d-46a1-b9fb-68b3482dca91" containerID="733b1ee8462fb43cd7252bc85846af597025216a00c167e7ed28ee04e0ec454e" exitCode=0 Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.100460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-9d23-account-create-kjvbn" event={"ID":"88ae92d9-bc5d-46a1-b9fb-68b3482dca91","Type":"ContainerDied","Data":"733b1ee8462fb43cd7252bc85846af597025216a00c167e7ed28ee04e0ec454e"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.100481 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d23-account-create-kjvbn" event={"ID":"88ae92d9-bc5d-46a1-b9fb-68b3482dca91","Type":"ContainerStarted","Data":"366ff4a5389f4fe134961a846745261e2c112630b56451a75a2d1740f48d9cdf"} Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.136052 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wlzpt" podStartSLOduration=1.992239418 podStartE2EDuration="6.136032417s" podCreationTimestamp="2025-10-14 13:16:47 +0000 UTC" firstStartedPulling="2025-10-14 13:16:48.171204791 +0000 UTC m=+946.088204594" lastFinishedPulling="2025-10-14 13:16:52.31499778 +0000 UTC m=+950.231997593" observedRunningTime="2025-10-14 13:16:53.134043062 +0000 UTC m=+951.051042885" watchObservedRunningTime="2025-10-14 13:16:53.136032417 +0000 UTC m=+951.053032230" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.714975 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-7dgrl"] Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.715330 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerName="dnsmasq-dns" containerID="cri-o://91955c4a91227dbea9570a993160a537c157f36589ea153fd738e9fd0d017547" gracePeriod=10 Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.719456 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.748580 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-cntf6"] 
Oct 14 13:16:53 crc kubenswrapper[4837]: E1014 13:16:53.748923 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" containerName="glance-db-sync" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.748937 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" containerName="glance-db-sync" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.749102 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" containerName="glance-db-sync" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.749902 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.765963 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-cntf6"] Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.900828 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.900922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-config\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.901022 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-svc\") pod 
\"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.901042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.901071 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qqt\" (UniqueName: \"kubernetes.io/projected/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-kube-api-access-74qqt\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:53 crc kubenswrapper[4837]: I1014 13:16:53.901115 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.002188 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-config\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.002264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-svc\") pod 
\"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.002292 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.002326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qqt\" (UniqueName: \"kubernetes.io/projected/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-kube-api-access-74qqt\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.002381 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.002466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.003790 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: 
\"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.003918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.003922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-svc\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.005565 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-config\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.006638 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.025230 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qqt\" (UniqueName: \"kubernetes.io/projected/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-kube-api-access-74qqt\") pod \"dnsmasq-dns-895cf5cf-cntf6\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc 
kubenswrapper[4837]: I1014 13:16:54.076536 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.112262 4837 generic.go:334] "Generic (PLEG): container finished" podID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerID="91955c4a91227dbea9570a993160a537c157f36589ea153fd738e9fd0d017547" exitCode=0 Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.112310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" event={"ID":"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4","Type":"ContainerDied","Data":"91955c4a91227dbea9570a993160a537c157f36589ea153fd738e9fd0d017547"} Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.112363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" event={"ID":"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4","Type":"ContainerDied","Data":"bc24c042c0fba3179d467228147fb266ec2d4a72a7a3e34d66005d1797ccc4c8"} Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.112379 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc24c042c0fba3179d467228147fb266ec2d4a72a7a3e34d66005d1797ccc4c8" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.171993 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.381884 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-swift-storage-0\") pod \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.382290 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-sb\") pod \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.382325 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c87r\" (UniqueName: \"kubernetes.io/projected/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-kube-api-access-2c87r\") pod \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.382413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-nb\") pod \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.382436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-svc\") pod \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.382475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-config\") pod \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\" (UID: \"8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4\") " Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.390758 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-kube-api-access-2c87r" (OuterVolumeSpecName: "kube-api-access-2c87r") pod "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" (UID: "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4"). InnerVolumeSpecName "kube-api-access-2c87r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.423965 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" (UID: "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.434790 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" (UID: "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.441453 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" (UID: "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.450317 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-config" (OuterVolumeSpecName: "config") pod "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" (UID: "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.454102 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" (UID: "8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.488142 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c87r\" (UniqueName: \"kubernetes.io/projected/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-kube-api-access-2c87r\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.488189 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.488201 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.488220 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 
13:16:54.488232 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:54 crc kubenswrapper[4837]: I1014 13:16:54.488243 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.730077 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.747806 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.770038 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.770569 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-cntf6"] Oct 14 13:16:56 crc kubenswrapper[4837]: W1014 13:16:54.790432 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2188251b_90b0_46fe_9bb2_5b16cd3d1dac.slice/crio-f0790595b0d8578b063d709db9a141bead75efb47096618555a636ce11f155d2 WatchSource:0}: Error finding container f0790595b0d8578b063d709db9a141bead75efb47096618555a636ce11f155d2: Status 404 returned error can't find the container with id f0790595b0d8578b063d709db9a141bead75efb47096618555a636ce11f155d2 Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.893833 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5lb5\" (UniqueName: \"kubernetes.io/projected/e46f2d2f-de1a-48e3-b6dd-82e2780ac592-kube-api-access-k5lb5\") pod \"e46f2d2f-de1a-48e3-b6dd-82e2780ac592\" (UID: \"e46f2d2f-de1a-48e3-b6dd-82e2780ac592\") " Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.893999 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtgln\" (UniqueName: \"kubernetes.io/projected/55a9e520-b0fd-4651-9606-cbd533bea731-kube-api-access-wtgln\") pod \"55a9e520-b0fd-4651-9606-cbd533bea731\" (UID: \"55a9e520-b0fd-4651-9606-cbd533bea731\") " Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.894064 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkxlx\" (UniqueName: \"kubernetes.io/projected/88ae92d9-bc5d-46a1-b9fb-68b3482dca91-kube-api-access-jkxlx\") pod \"88ae92d9-bc5d-46a1-b9fb-68b3482dca91\" (UID: \"88ae92d9-bc5d-46a1-b9fb-68b3482dca91\") " Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.898282 4837 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ae92d9-bc5d-46a1-b9fb-68b3482dca91-kube-api-access-jkxlx" (OuterVolumeSpecName: "kube-api-access-jkxlx") pod "88ae92d9-bc5d-46a1-b9fb-68b3482dca91" (UID: "88ae92d9-bc5d-46a1-b9fb-68b3482dca91"). InnerVolumeSpecName "kube-api-access-jkxlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.899571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46f2d2f-de1a-48e3-b6dd-82e2780ac592-kube-api-access-k5lb5" (OuterVolumeSpecName: "kube-api-access-k5lb5") pod "e46f2d2f-de1a-48e3-b6dd-82e2780ac592" (UID: "e46f2d2f-de1a-48e3-b6dd-82e2780ac592"). InnerVolumeSpecName "kube-api-access-k5lb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:54.906809 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a9e520-b0fd-4651-9606-cbd533bea731-kube-api-access-wtgln" (OuterVolumeSpecName: "kube-api-access-wtgln") pod "55a9e520-b0fd-4651-9606-cbd533bea731" (UID: "55a9e520-b0fd-4651-9606-cbd533bea731"). InnerVolumeSpecName "kube-api-access-wtgln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.011677 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5lb5\" (UniqueName: \"kubernetes.io/projected/e46f2d2f-de1a-48e3-b6dd-82e2780ac592-kube-api-access-k5lb5\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.011715 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtgln\" (UniqueName: \"kubernetes.io/projected/55a9e520-b0fd-4651-9606-cbd533bea731-kube-api-access-wtgln\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.011728 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkxlx\" (UniqueName: \"kubernetes.io/projected/88ae92d9-bc5d-46a1-b9fb-68b3482dca91-kube-api-access-jkxlx\") on node \"crc\" DevicePath \"\"" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.120869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-398e-account-create-2f6jz" event={"ID":"55a9e520-b0fd-4651-9606-cbd533bea731","Type":"ContainerDied","Data":"be842895fb43be2419ec5f14c5429a7fbb6ec7467a4363e0978d3846adeff79a"} Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.120914 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be842895fb43be2419ec5f14c5429a7fbb6ec7467a4363e0978d3846adeff79a" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.120961 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-398e-account-create-2f6jz" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.125221 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d23-account-create-kjvbn" event={"ID":"88ae92d9-bc5d-46a1-b9fb-68b3482dca91","Type":"ContainerDied","Data":"366ff4a5389f4fe134961a846745261e2c112630b56451a75a2d1740f48d9cdf"} Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.125263 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="366ff4a5389f4fe134961a846745261e2c112630b56451a75a2d1740f48d9cdf" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.125344 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d23-account-create-kjvbn" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.126534 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" event={"ID":"2188251b-90b0-46fe-9bb2-5b16cd3d1dac","Type":"ContainerStarted","Data":"f0790595b0d8578b063d709db9a141bead75efb47096618555a636ce11f155d2"} Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.128402 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-eb27-account-create-mn6ht" event={"ID":"e46f2d2f-de1a-48e3-b6dd-82e2780ac592","Type":"ContainerDied","Data":"dc9cf5a6b6d5b1734929998caf3e68ad47d110949288c2acc4fc5f48c92281d2"} Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.128448 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc9cf5a6b6d5b1734929998caf3e68ad47d110949288c2acc4fc5f48c92281d2" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.128425 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eb27-account-create-mn6ht" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.128425 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-7dgrl" Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.155531 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-7dgrl"] Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:55.161622 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-7dgrl"] Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:56.138849 4837 generic.go:334] "Generic (PLEG): container finished" podID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerID="2298cf1400e8ce4833997890ed56d35a017ddf8819730c49a57ddaa49e737262" exitCode=0 Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:56.139305 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" event={"ID":"2188251b-90b0-46fe-9bb2-5b16cd3d1dac","Type":"ContainerDied","Data":"2298cf1400e8ce4833997890ed56d35a017ddf8819730c49a57ddaa49e737262"} Oct 14 13:16:56 crc kubenswrapper[4837]: I1014 13:16:56.795028 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" path="/var/lib/kubelet/pods/8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4/volumes" Oct 14 13:16:57 crc kubenswrapper[4837]: I1014 13:16:57.148564 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" event={"ID":"2188251b-90b0-46fe-9bb2-5b16cd3d1dac","Type":"ContainerStarted","Data":"644a9b4c376b82629ca8c5239b44064a1e85def5436830e1785a2d4ea653b902"} Oct 14 13:16:57 crc kubenswrapper[4837]: I1014 13:16:57.150506 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:16:57 crc kubenswrapper[4837]: I1014 13:16:57.167401 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podStartSLOduration=4.167384812 podStartE2EDuration="4.167384812s" podCreationTimestamp="2025-10-14 13:16:53 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:16:57.165024488 +0000 UTC m=+955.082024301" watchObservedRunningTime="2025-10-14 13:16:57.167384812 +0000 UTC m=+955.084384625" Oct 14 13:17:02 crc kubenswrapper[4837]: I1014 13:17:02.193194 4837 generic.go:334] "Generic (PLEG): container finished" podID="6b72d2be-c361-496e-9ea4-990b2e8f15a3" containerID="38ba2d4649e5a96fe11d8a1cf49a7469c6fb3ed7e29fa35d236852eacac935c0" exitCode=0 Oct 14 13:17:02 crc kubenswrapper[4837]: I1014 13:17:02.193273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlzpt" event={"ID":"6b72d2be-c361-496e-9ea4-990b2e8f15a3","Type":"ContainerDied","Data":"38ba2d4649e5a96fe11d8a1cf49a7469c6fb3ed7e29fa35d236852eacac935c0"} Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.558288 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.709727 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-combined-ca-bundle\") pod \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.709988 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-config-data\") pod \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.710067 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27htr\" (UniqueName: \"kubernetes.io/projected/6b72d2be-c361-496e-9ea4-990b2e8f15a3-kube-api-access-27htr\") pod 
\"6b72d2be-c361-496e-9ea4-990b2e8f15a3\" (UID: \"6b72d2be-c361-496e-9ea4-990b2e8f15a3\") " Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.716646 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b72d2be-c361-496e-9ea4-990b2e8f15a3-kube-api-access-27htr" (OuterVolumeSpecName: "kube-api-access-27htr") pod "6b72d2be-c361-496e-9ea4-990b2e8f15a3" (UID: "6b72d2be-c361-496e-9ea4-990b2e8f15a3"). InnerVolumeSpecName "kube-api-access-27htr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.739234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b72d2be-c361-496e-9ea4-990b2e8f15a3" (UID: "6b72d2be-c361-496e-9ea4-990b2e8f15a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.768639 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-config-data" (OuterVolumeSpecName: "config-data") pod "6b72d2be-c361-496e-9ea4-990b2e8f15a3" (UID: "6b72d2be-c361-496e-9ea4-990b2e8f15a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.812340 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27htr\" (UniqueName: \"kubernetes.io/projected/6b72d2be-c361-496e-9ea4-990b2e8f15a3-kube-api-access-27htr\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.812380 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:03 crc kubenswrapper[4837]: I1014 13:17:03.812394 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b72d2be-c361-496e-9ea4-990b2e8f15a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.078316 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.146419 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6svgj"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.146687 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerName="dnsmasq-dns" containerID="cri-o://ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b" gracePeriod=10 Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.218340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlzpt" event={"ID":"6b72d2be-c361-496e-9ea4-990b2e8f15a3","Type":"ContainerDied","Data":"632fedaf8f4c786fd778553e74c89898ae28b8ea21781d9827f3dbe8808e63bf"} Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.218387 4837 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="632fedaf8f4c786fd778553e74c89898ae28b8ea21781d9827f3dbe8808e63bf" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.218461 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wlzpt" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.453309 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dxj56"] Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.454016 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerName="dnsmasq-dns" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454032 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerName="dnsmasq-dns" Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.454047 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ae92d9-bc5d-46a1-b9fb-68b3482dca91" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454055 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ae92d9-bc5d-46a1-b9fb-68b3482dca91" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.454071 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a9e520-b0fd-4651-9606-cbd533bea731" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454080 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a9e520-b0fd-4651-9606-cbd533bea731" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.454098 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46f2d2f-de1a-48e3-b6dd-82e2780ac592" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454106 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e46f2d2f-de1a-48e3-b6dd-82e2780ac592" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.454115 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b72d2be-c361-496e-9ea4-990b2e8f15a3" containerName="keystone-db-sync" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454123 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b72d2be-c361-496e-9ea4-990b2e8f15a3" containerName="keystone-db-sync" Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.454140 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerName="init" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454147 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerName="init" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454354 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46f2d2f-de1a-48e3-b6dd-82e2780ac592" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454372 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ae92d9-bc5d-46a1-b9fb-68b3482dca91" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454393 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c650b7f-2dc5-4dde-82b2-1765fb5bf2f4" containerName="dnsmasq-dns" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454405 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b72d2be-c361-496e-9ea4-990b2e8f15a3" containerName="keystone-db-sync" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.454420 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a9e520-b0fd-4651-9606-cbd533bea731" containerName="mariadb-account-create" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.456911 4837 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.474614 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dxj56"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.504001 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k4xk5"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.505120 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.507641 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.508139 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.508307 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.509050 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6z55" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.518669 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k4xk5"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.624320 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.627659 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-fernet-keys\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.627947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-combined-ca-bundle\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628047 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-scripts\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628328 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-config\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: 
\"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628408 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628624 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-config-data\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628702 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-credential-keys\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628893 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24fb6\" (UniqueName: \"kubernetes.io/projected/c7de719a-1c3d-44eb-ab8b-7c260977d93c-kube-api-access-24fb6\") pod 
\"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.628969 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htzkf\" (UniqueName: \"kubernetes.io/projected/d02e469c-24c1-4d5f-af08-d3974395e1a0-kube-api-access-htzkf\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.629066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.687462 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-5j7sn"] Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.688050 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerName="dnsmasq-dns" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.688136 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerName="dnsmasq-dns" Oct 14 13:17:04 crc kubenswrapper[4837]: E1014 13:17:04.688239 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerName="init" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.688305 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerName="init" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.688616 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerName="dnsmasq-dns" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.702432 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.732793 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-sb\") pod \"bdb9c373-ac68-49f9-876d-6835e623ff5f\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.732849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-config\") pod \"bdb9c373-ac68-49f9-876d-6835e623ff5f\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.732917 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-dns-svc\") pod \"bdb9c373-ac68-49f9-876d-6835e623ff5f\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.732954 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w84r\" (UniqueName: \"kubernetes.io/projected/bdb9c373-ac68-49f9-876d-6835e623ff5f-kube-api-access-2w84r\") pod \"bdb9c373-ac68-49f9-876d-6835e623ff5f\" (UID: \"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733001 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-nb\") pod \"bdb9c373-ac68-49f9-876d-6835e623ff5f\" (UID: 
\"bdb9c373-ac68-49f9-876d-6835e623ff5f\") " Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733689 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-config-data\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-credential-keys\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24fb6\" (UniqueName: \"kubernetes.io/projected/c7de719a-1c3d-44eb-ab8b-7c260977d93c-kube-api-access-24fb6\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc 
kubenswrapper[4837]: I1014 13:17:04.733921 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htzkf\" (UniqueName: \"kubernetes.io/projected/d02e469c-24c1-4d5f-af08-d3974395e1a0-kube-api-access-htzkf\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.733974 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.734045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-combined-ca-bundle\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.734073 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-fernet-keys\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.734109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-scripts\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.734174 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.734210 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-config\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.736539 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.745427 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-j45rh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.745923 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.750628 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.762303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.764522 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.764947 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-config\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.766551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-credential-keys\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.776854 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-fernet-keys\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.780058 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.781936 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9998548fc-h9brf"] Oct 14 13:17:04 crc 
kubenswrapper[4837]: I1014 13:17:04.862458 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5j7sn"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.863345 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-config-data\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.863661 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.863792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb9c373-ac68-49f9-876d-6835e623ff5f-kube-api-access-2w84r" (OuterVolumeSpecName: "kube-api-access-2w84r") pod "bdb9c373-ac68-49f9-876d-6835e623ff5f" (UID: "bdb9c373-ac68-49f9-876d-6835e623ff5f"). InnerVolumeSpecName "kube-api-access-2w84r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.864275 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-scripts\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.867093 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-combined-ca-bundle\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877358 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-combined-ca-bundle\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vcc\" (UniqueName: \"kubernetes.io/projected/14d2b3ef-eed6-48cb-948b-3618d6f53fff-kube-api-access-x7vcc\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-db-sync-config-data\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 
13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877468 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-scripts\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877512 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14d2b3ef-eed6-48cb-948b-3618d6f53fff-etc-machine-id\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877536 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-config-data\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877666 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w84r\" (UniqueName: \"kubernetes.io/projected/bdb9c373-ac68-49f9-876d-6835e623ff5f-kube-api-access-2w84r\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.877924 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-khb8b" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.878054 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.883204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htzkf\" (UniqueName: 
\"kubernetes.io/projected/d02e469c-24c1-4d5f-af08-d3974395e1a0-kube-api-access-htzkf\") pod \"keystone-bootstrap-k4xk5\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.885909 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.886245 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.890697 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24fb6\" (UniqueName: \"kubernetes.io/projected/c7de719a-1c3d-44eb-ab8b-7c260977d93c-kube-api-access-24fb6\") pod \"dnsmasq-dns-6c9c9f998c-dxj56\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.904069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bdb9c373-ac68-49f9-876d-6835e623ff5f" (UID: "bdb9c373-ac68-49f9-876d-6835e623ff5f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.932152 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9998548fc-h9brf"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.932232 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.934722 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-b69bq"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.935877 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.935929 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.947065 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.947121 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b69bq"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.951046 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bdb9c373-ac68-49f9-876d-6835e623ff5f" (UID: "bdb9c373-ac68-49f9-876d-6835e623ff5f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.956547 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.958566 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.959249 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4s92n" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.962069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bdb9c373-ac68-49f9-876d-6835e623ff5f" (UID: "bdb9c373-ac68-49f9-876d-6835e623ff5f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.962411 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.963189 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.964577 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jddnh"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.965642 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.968926 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k4rzv" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.969170 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.969282 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.974805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-config" (OuterVolumeSpecName: "config") pod "bdb9c373-ac68-49f9-876d-6835e623ff5f" (UID: "bdb9c373-ac68-49f9-876d-6835e623ff5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.982249 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mwvdq"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.984122 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985265 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jddnh"] Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985778 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-combined-ca-bundle\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985827 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-scripts\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985856 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985879 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-config-data\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-combined-ca-bundle\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvm4\" (UniqueName: \"kubernetes.io/projected/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-kube-api-access-9mvm4\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985959 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-config-data\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.985988 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vcc\" (UniqueName: \"kubernetes.io/projected/14d2b3ef-eed6-48cb-948b-3618d6f53fff-kube-api-access-x7vcc\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.986020 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-db-sync-config-data\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.986070 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-config\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.986104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-combined-ca-bundle\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.986135 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-scripts\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.986189 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-run-httpd\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.989451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kz99\" (UniqueName: \"kubernetes.io/projected/6d33f0a5-b130-4614-9636-fa0d61fa4e11-kube-api-access-6kz99\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.989673 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-logs\") pod 
\"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.989765 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-log-httpd\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.989856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14d2b3ef-eed6-48cb-948b-3618d6f53fff-etc-machine-id\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.989965 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14d2b3ef-eed6-48cb-948b-3618d6f53fff-etc-machine-id\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.990705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-config-data\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.991603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-horizon-secret-key\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 
13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.991762 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whw4d\" (UniqueName: \"kubernetes.io/projected/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-kube-api-access-whw4d\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.991834 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-logs\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.991902 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-scripts\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.991974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4zv\" (UniqueName: \"kubernetes.io/projected/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-kube-api-access-hn4zv\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.992061 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc76j\" (UniqueName: \"kubernetes.io/projected/58d55e59-7431-474a-a2eb-be646017f3c2-kube-api-access-wc76j\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc 
kubenswrapper[4837]: I1014 13:17:04.992124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.992201 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-combined-ca-bundle\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.992265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-config-data\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.992376 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-scripts\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.993977 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-db-sync-config-data\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.994322 4837 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.994653 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.996531 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.997062 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.997188 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hs978" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.998378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-config-data\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.998576 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:04 crc kubenswrapper[4837]: I1014 13:17:04.999011 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdb9c373-ac68-49f9-876d-6835e623ff5f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.001723 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-combined-ca-bundle\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.012130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-db-sync-config-data\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.022818 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-scripts\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.041799 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mwvdq"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.048012 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vcc\" (UniqueName: \"kubernetes.io/projected/14d2b3ef-eed6-48cb-948b-3618d6f53fff-kube-api-access-x7vcc\") pod \"cinder-db-sync-5j7sn\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.062284 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dxj56"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.063140 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.063768 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.093929 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84646c67f7-4sb8f"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.095698 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101145 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-horizon-secret-key\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whw4d\" (UniqueName: \"kubernetes.io/projected/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-kube-api-access-whw4d\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101218 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-logs\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-scripts\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101252 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hn4zv\" (UniqueName: \"kubernetes.io/projected/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-kube-api-access-hn4zv\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101270 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc76j\" (UniqueName: \"kubernetes.io/projected/58d55e59-7431-474a-a2eb-be646017f3c2-kube-api-access-wc76j\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101287 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101349 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-combined-ca-bundle\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101380 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-config-data\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-scripts\") 
pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101489 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-db-sync-config-data\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-combined-ca-bundle\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101527 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-scripts\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101540 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-config-data\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc 
kubenswrapper[4837]: I1014 13:17:05.101573 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvm4\" (UniqueName: \"kubernetes.io/projected/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-kube-api-access-9mvm4\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-config-data\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101616 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-config\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101634 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-combined-ca-bundle\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101653 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-logs\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-run-httpd\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101747 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kz99\" (UniqueName: \"kubernetes.io/projected/6d33f0a5-b130-4614-9636-fa0d61fa4e11-kube-api-access-6kz99\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101767 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-logs\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.101782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-log-httpd\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.102129 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-log-httpd\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.103373 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-scripts\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 
crc kubenswrapper[4837]: I1014 13:17:05.104145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-run-httpd\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.109701 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-db-sync-config-data\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.111296 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-config-data\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.112245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-scripts\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.112721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-logs\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.115820 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-config\") pod 
\"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.123212 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84646c67f7-4sb8f"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.126553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-horizon-secret-key\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.126874 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-combined-ca-bundle\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.127893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.134254 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-config-data\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.149877 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-combined-ca-bundle\") pod 
\"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.150419 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.150777 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-scripts\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.151649 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-config-data\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.153901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvm4\" (UniqueName: \"kubernetes.io/projected/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-kube-api-access-9mvm4\") pod \"horizon-9998548fc-h9brf\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.155186 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whw4d\" (UniqueName: \"kubernetes.io/projected/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-kube-api-access-whw4d\") pod \"placement-db-sync-jddnh\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.155981 
4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kz99\" (UniqueName: \"kubernetes.io/projected/6d33f0a5-b130-4614-9636-fa0d61fa4e11-kube-api-access-6kz99\") pod \"barbican-db-sync-b69bq\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.164207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4zv\" (UniqueName: \"kubernetes.io/projected/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-kube-api-access-hn4zv\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.164530 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-combined-ca-bundle\") pod \"neutron-db-sync-mwvdq\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.171222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc76j\" (UniqueName: \"kubernetes.io/projected/58d55e59-7431-474a-a2eb-be646017f3c2-kube-api-access-wc76j\") pod \"ceilometer-0\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.180222 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vfpnl"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.181560 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.200252 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203658 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctst5\" (UniqueName: \"kubernetes.io/projected/6c469b81-89a4-4d35-bc9a-b04b82c2571e-kube-api-access-ctst5\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203728 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-config\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203763 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203811 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e6fcb6-6898-40ca-af1d-79a445d128c8-logs\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203827 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-config-data\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203851 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0e6fcb6-6898-40ca-af1d-79a445d128c8-horizon-secret-key\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203876 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203919 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6p8t\" (UniqueName: \"kubernetes.io/projected/e0e6fcb6-6898-40ca-af1d-79a445d128c8-kube-api-access-t6p8t\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.203960 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-scripts\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.205421 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.209103 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.209149 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.209107 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.209277 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h5zj9" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.212145 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.248089 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vfpnl"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.265823 4837 generic.go:334] "Generic (PLEG): container finished" podID="bdb9c373-ac68-49f9-876d-6835e623ff5f" containerID="ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b" exitCode=0 Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.266059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" event={"ID":"bdb9c373-ac68-49f9-876d-6835e623ff5f","Type":"ContainerDied","Data":"ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b"} Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.266148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" event={"ID":"bdb9c373-ac68-49f9-876d-6835e623ff5f","Type":"ContainerDied","Data":"20ff5eb59b5151bfddcca7dacee03fe95051ed47cb32f7f443459495e3d81de7"} Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.266244 4837 scope.go:117] "RemoveContainer" containerID="ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.266512 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6svgj" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.281556 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.297638 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b69bq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304563 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304603 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6p8t\" (UniqueName: \"kubernetes.io/projected/e0e6fcb6-6898-40ca-af1d-79a445d128c8-kube-api-access-t6p8t\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304624 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304642 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-scripts\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-scripts\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " 
pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304684 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-logs\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304742 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctst5\" (UniqueName: \"kubernetes.io/projected/6c469b81-89a4-4d35-bc9a-b04b82c2571e-kube-api-access-ctst5\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304763 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " 
pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304814 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-config\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304835 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4n7p\" (UniqueName: \"kubernetes.io/projected/9691acb7-d684-4ebe-8e5f-59a97f495a88-kube-api-access-g4n7p\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e6fcb6-6898-40ca-af1d-79a445d128c8-logs\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc 
kubenswrapper[4837]: I1014 13:17:05.304909 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-config-data\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0e6fcb6-6898-40ca-af1d-79a445d128c8-horizon-secret-key\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-config-data\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.304966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.305809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.306268 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.306983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-scripts\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.307877 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e6fcb6-6898-40ca-af1d-79a445d128c8-logs\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.308484 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.308786 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-config-data\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.309007 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.311461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0e6fcb6-6898-40ca-af1d-79a445d128c8-horizon-secret-key\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.314559 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-config\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.322416 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.325959 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctst5\" (UniqueName: \"kubernetes.io/projected/6c469b81-89a4-4d35-bc9a-b04b82c2571e-kube-api-access-ctst5\") pod \"dnsmasq-dns-57c957c4ff-vfpnl\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.342011 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6p8t\" (UniqueName: \"kubernetes.io/projected/e0e6fcb6-6898-40ca-af1d-79a445d128c8-kube-api-access-t6p8t\") pod \"horizon-84646c67f7-4sb8f\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.353594 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jddnh" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.372213 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.384254 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.385638 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.388748 4837 scope.go:117] "RemoveContainer" containerID="7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.388796 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.395017 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.397772 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406213 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406269 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4n7p\" (UniqueName: \"kubernetes.io/projected/9691acb7-d684-4ebe-8e5f-59a97f495a88-kube-api-access-g4n7p\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406354 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-config-data\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406385 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406403 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-scripts\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-logs\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406471 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.406749 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.413101 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-scripts\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.413400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.413789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-logs\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.414856 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.415091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.424362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.446053 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4n7p\" (UniqueName: \"kubernetes.io/projected/9691acb7-d684-4ebe-8e5f-59a97f495a88-kube-api-access-g4n7p\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.457573 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.468042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.505732 4837 scope.go:117] "RemoveContainer" containerID="ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b" Oct 14 13:17:05 crc kubenswrapper[4837]: E1014 13:17:05.506689 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b\": container with ID starting with ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b not found: ID does not exist" containerID="ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.506733 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b"} err="failed to get container status \"ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b\": rpc error: code = NotFound desc = could not find container \"ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b\": container with ID starting with ed958b68c4898210a8d0e2c03157e24b0f03b6c034a3dd95619d60cb8cd81c2b not found: ID does not exist" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.506764 4837 scope.go:117] "RemoveContainer" containerID="7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.508123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dwqss\" (UniqueName: \"kubernetes.io/projected/44c72978-1e69-4040-b174-cdc1c9ebe222-kube-api-access-dwqss\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.508182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-logs\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.508221 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.508286 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.508332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: E1014 13:17:05.512149 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae\": container with ID starting with 7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae not found: ID does not exist" containerID="7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.512219 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae"} err="failed to get container status \"7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae\": rpc error: code = NotFound desc = could not find container \"7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae\": container with ID starting with 7e0e38a6724030a859802f03ec37c81b06159d59c41fef5da86f98e5402e8dae not found: ID does not exist" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.519451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.519681 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.519746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.520073 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.534220 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6svgj"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.536742 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.541508 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6svgj"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwqss\" (UniqueName: \"kubernetes.io/projected/44c72978-1e69-4040-b174-cdc1c9ebe222-kube-api-access-dwqss\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-logs\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621073 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc 
kubenswrapper[4837]: I1014 13:17:05.621123 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621202 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.621994 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.622463 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-logs\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.628756 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.632493 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.641126 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.649581 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.660059 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwqss\" (UniqueName: \"kubernetes.io/projected/44c72978-1e69-4040-b174-cdc1c9ebe222-kube-api-access-dwqss\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.661488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.680951 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k4xk5"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.698498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: W1014 13:17:05.726390 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd02e469c_24c1_4d5f_af08_d3974395e1a0.slice/crio-f600308babb7fc2f13afc108cdba03a099d4406c6c7816a9cbd3990d20b7b183 WatchSource:0}: Error finding container f600308babb7fc2f13afc108cdba03a099d4406c6c7816a9cbd3990d20b7b183: Status 404 returned error can't find the container with id 
f600308babb7fc2f13afc108cdba03a099d4406c6c7816a9cbd3990d20b7b183 Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.729668 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.829705 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5j7sn"] Oct 14 13:17:05 crc kubenswrapper[4837]: I1014 13:17:05.852203 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dxj56"] Oct 14 13:17:05 crc kubenswrapper[4837]: W1014 13:17:05.954633 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14d2b3ef_eed6_48cb_948b_3618d6f53fff.slice/crio-ddcdcf894de7560cb758f53f36aa03f36836b12c6aa96a97453b8dd5b210cdc9 WatchSource:0}: Error finding container ddcdcf894de7560cb758f53f36aa03f36836b12c6aa96a97453b8dd5b210cdc9: Status 404 returned error can't find the container with id ddcdcf894de7560cb758f53f36aa03f36836b12c6aa96a97453b8dd5b210cdc9 Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.274745 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4xk5" event={"ID":"d02e469c-24c1-4d5f-af08-d3974395e1a0","Type":"ContainerStarted","Data":"74dc92c73db3394647396a31b6fa40ed645d1a6841924c27c9136d65e22ce508"} Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.274796 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4xk5" event={"ID":"d02e469c-24c1-4d5f-af08-d3974395e1a0","Type":"ContainerStarted","Data":"f600308babb7fc2f13afc108cdba03a099d4406c6c7816a9cbd3990d20b7b183"} Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.276844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5j7sn" 
event={"ID":"14d2b3ef-eed6-48cb-948b-3618d6f53fff","Type":"ContainerStarted","Data":"ddcdcf894de7560cb758f53f36aa03f36836b12c6aa96a97453b8dd5b210cdc9"} Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.278466 4837 generic.go:334] "Generic (PLEG): container finished" podID="c7de719a-1c3d-44eb-ab8b-7c260977d93c" containerID="de5e2c64018fc72e470599b497fd8d5a04d5cf10a67a90d300284af724c7498e" exitCode=0 Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.278494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" event={"ID":"c7de719a-1c3d-44eb-ab8b-7c260977d93c","Type":"ContainerDied","Data":"de5e2c64018fc72e470599b497fd8d5a04d5cf10a67a90d300284af724c7498e"} Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.278522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" event={"ID":"c7de719a-1c3d-44eb-ab8b-7c260977d93c","Type":"ContainerStarted","Data":"ac62ab138631c1522f355010beba77c3566498681e7d1a1b9c5140ead8c57c01"} Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.298533 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k4xk5" podStartSLOduration=2.298495622 podStartE2EDuration="2.298495622s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:17:06.29185496 +0000 UTC m=+964.208854783" watchObservedRunningTime="2025-10-14 13:17:06.298495622 +0000 UTC m=+964.215495435" Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.333227 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9998548fc-h9brf"] Oct 14 13:17:06 crc kubenswrapper[4837]: W1014 13:17:06.339039 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a06f79e_1f0f_4c64_b7e1_24e0bfc0acee.slice/crio-fb92ae5929388fc4e1bf0a48cdc6d455e8700782a59f143d010d1a22c50e417c WatchSource:0}: Error finding container fb92ae5929388fc4e1bf0a48cdc6d455e8700782a59f143d010d1a22c50e417c: Status 404 returned error can't find the container with id fb92ae5929388fc4e1bf0a48cdc6d455e8700782a59f143d010d1a22c50e417c Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.690893 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mwvdq"] Oct 14 13:17:06 crc kubenswrapper[4837]: W1014 13:17:06.721795 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dc6adfa_9f60_4e67_ba33_98badd63dd5f.slice/crio-27249ad07076b3a7d7f38d610d4378b967197f686830d0a7e27f6c792ccc97e9 WatchSource:0}: Error finding container 27249ad07076b3a7d7f38d610d4378b967197f686830d0a7e27f6c792ccc97e9: Status 404 returned error can't find the container with id 27249ad07076b3a7d7f38d610d4378b967197f686830d0a7e27f6c792ccc97e9 Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.756765 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84646c67f7-4sb8f"] Oct 14 13:17:06 crc kubenswrapper[4837]: W1014 13:17:06.762697 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd417ecbc_b3c6_4550_9d56_f1f13d2ef3bd.slice/crio-fe0c2f37085b5968c6d176c2d4391ce8b3120e8cfb49f49fad6f16896876023b WatchSource:0}: Error finding container fe0c2f37085b5968c6d176c2d4391ce8b3120e8cfb49f49fad6f16896876023b: Status 404 returned error can't find the container with id fe0c2f37085b5968c6d176c2d4391ce8b3120e8cfb49f49fad6f16896876023b Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.775850 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vfpnl"] Oct 14 13:17:06 crc 
kubenswrapper[4837]: W1014 13:17:06.783601 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d33f0a5_b130_4614_9636_fa0d61fa4e11.slice/crio-bdefe6a3f10b04c1e9bfe9e6260b260001fd7da4fe00d421ef8233c1b02f02a7 WatchSource:0}: Error finding container bdefe6a3f10b04c1e9bfe9e6260b260001fd7da4fe00d421ef8233c1b02f02a7: Status 404 returned error can't find the container with id bdefe6a3f10b04c1e9bfe9e6260b260001fd7da4fe00d421ef8233c1b02f02a7 Oct 14 13:17:06 crc kubenswrapper[4837]: W1014 13:17:06.788274 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58d55e59_7431_474a_a2eb_be646017f3c2.slice/crio-cd93f4de982a838533f170059bfcde8037702b62f379542706a4d58500e7725b WatchSource:0}: Error finding container cd93f4de982a838533f170059bfcde8037702b62f379542706a4d58500e7725b: Status 404 returned error can't find the container with id cd93f4de982a838533f170059bfcde8037702b62f379542706a4d58500e7725b Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.809951 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb9c373-ac68-49f9-876d-6835e623ff5f" path="/var/lib/kubelet/pods/bdb9c373-ac68-49f9-876d-6835e623ff5f/volumes" Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.810882 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jddnh"] Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.811716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b69bq"] Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.813744 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:17:06 crc kubenswrapper[4837]: I1014 13:17:06.953097 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.056606 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24fb6\" (UniqueName: \"kubernetes.io/projected/c7de719a-1c3d-44eb-ab8b-7c260977d93c-kube-api-access-24fb6\") pod \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.061857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-nb\") pod \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.061944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-sb\") pod \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.062029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-svc\") pod \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.062048 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-config\") pod \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.062108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-swift-storage-0\") pod \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\" (UID: \"c7de719a-1c3d-44eb-ab8b-7c260977d93c\") " Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.079764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7de719a-1c3d-44eb-ab8b-7c260977d93c-kube-api-access-24fb6" (OuterVolumeSpecName: "kube-api-access-24fb6") pod "c7de719a-1c3d-44eb-ab8b-7c260977d93c" (UID: "c7de719a-1c3d-44eb-ab8b-7c260977d93c"). InnerVolumeSpecName "kube-api-access-24fb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.094029 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c7de719a-1c3d-44eb-ab8b-7c260977d93c" (UID: "c7de719a-1c3d-44eb-ab8b-7c260977d93c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.096141 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7de719a-1c3d-44eb-ab8b-7c260977d93c" (UID: "c7de719a-1c3d-44eb-ab8b-7c260977d93c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.102910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7de719a-1c3d-44eb-ab8b-7c260977d93c" (UID: "c7de719a-1c3d-44eb-ab8b-7c260977d93c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.128384 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-config" (OuterVolumeSpecName: "config") pod "c7de719a-1c3d-44eb-ab8b-7c260977d93c" (UID: "c7de719a-1c3d-44eb-ab8b-7c260977d93c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.149548 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7de719a-1c3d-44eb-ab8b-7c260977d93c" (UID: "c7de719a-1c3d-44eb-ab8b-7c260977d93c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.159036 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.163910 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24fb6\" (UniqueName: \"kubernetes.io/projected/c7de719a-1c3d-44eb-ab8b-7c260977d93c-kube-api-access-24fb6\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.163955 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.163968 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.163982 4837 reconciler_common.go:293] "Volume detached for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.163993 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.164004 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7de719a-1c3d-44eb-ab8b-7c260977d93c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:07 crc kubenswrapper[4837]: W1014 13:17:07.174888 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9691acb7_d684_4ebe_8e5f_59a97f495a88.slice/crio-af6338c8e1470c2b4686d53910857e5d6df9963122a30163d40b2f6eca84abf8 WatchSource:0}: Error finding container af6338c8e1470c2b4686d53910857e5d6df9963122a30163d40b2f6eca84abf8: Status 404 returned error can't find the container with id af6338c8e1470c2b4686d53910857e5d6df9963122a30163d40b2f6eca84abf8 Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.290712 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9691acb7-d684-4ebe-8e5f-59a97f495a88","Type":"ContainerStarted","Data":"af6338c8e1470c2b4686d53910857e5d6df9963122a30163d40b2f6eca84abf8"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.292995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b69bq" event={"ID":"6d33f0a5-b130-4614-9636-fa0d61fa4e11","Type":"ContainerStarted","Data":"bdefe6a3f10b04c1e9bfe9e6260b260001fd7da4fe00d421ef8233c1b02f02a7"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.295456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"58d55e59-7431-474a-a2eb-be646017f3c2","Type":"ContainerStarted","Data":"cd93f4de982a838533f170059bfcde8037702b62f379542706a4d58500e7725b"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.302416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jddnh" event={"ID":"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd","Type":"ContainerStarted","Data":"fe0c2f37085b5968c6d176c2d4391ce8b3120e8cfb49f49fad6f16896876023b"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.305729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" event={"ID":"6c469b81-89a4-4d35-bc9a-b04b82c2571e","Type":"ContainerStarted","Data":"dcceead411ebf1fe41ae21b7356ab880d8b7974fa3baa4888d19ea8c330528ec"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.316615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84646c67f7-4sb8f" event={"ID":"e0e6fcb6-6898-40ca-af1d-79a445d128c8","Type":"ContainerStarted","Data":"a92a436da8b48a79231b8df745f377eaec3c5b2a583d8f471467e6748ca1541c"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.318096 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9998548fc-h9brf" event={"ID":"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee","Type":"ContainerStarted","Data":"fb92ae5929388fc4e1bf0a48cdc6d455e8700782a59f143d010d1a22c50e417c"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.325539 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mwvdq" event={"ID":"3dc6adfa-9f60-4e67-ba33-98badd63dd5f","Type":"ContainerStarted","Data":"755881917e433eb82b4d7761844beaad423e0a39296429067ee6da8049a61511"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.325594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mwvdq" 
event={"ID":"3dc6adfa-9f60-4e67-ba33-98badd63dd5f","Type":"ContainerStarted","Data":"27249ad07076b3a7d7f38d610d4378b967197f686830d0a7e27f6c792ccc97e9"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.330040 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.330022 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-dxj56" event={"ID":"c7de719a-1c3d-44eb-ab8b-7c260977d93c","Type":"ContainerDied","Data":"ac62ab138631c1522f355010beba77c3566498681e7d1a1b9c5140ead8c57c01"} Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.330116 4837 scope.go:117] "RemoveContainer" containerID="de5e2c64018fc72e470599b497fd8d5a04d5cf10a67a90d300284af724c7498e" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.344205 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mwvdq" podStartSLOduration=3.3441851 podStartE2EDuration="3.3441851s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:17:07.338962747 +0000 UTC m=+965.255962580" watchObservedRunningTime="2025-10-14 13:17:07.3441851 +0000 UTC m=+965.261184913" Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.393867 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dxj56"] Oct 14 13:17:07 crc kubenswrapper[4837]: I1014 13:17:07.405920 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dxj56"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.058780 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.195467 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.219035 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84646c67f7-4sb8f"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.233858 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.275885 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-768bbf6757-s92xp"] Oct 14 13:17:08 crc kubenswrapper[4837]: E1014 13:17:08.276345 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7de719a-1c3d-44eb-ab8b-7c260977d93c" containerName="init" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.276364 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7de719a-1c3d-44eb-ab8b-7c260977d93c" containerName="init" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.276596 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7de719a-1c3d-44eb-ab8b-7c260977d93c" containerName="init" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.277873 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.333143 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-768bbf6757-s92xp"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.350519 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.388678 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxh8\" (UniqueName: \"kubernetes.io/projected/6ec34006-1792-4006-a01c-9e557626f347-kube-api-access-psxh8\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.388791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ec34006-1792-4006-a01c-9e557626f347-horizon-secret-key\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.388825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-scripts\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.388851 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec34006-1792-4006-a01c-9e557626f347-logs\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc 
kubenswrapper[4837]: I1014 13:17:08.388866 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-config-data\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.409867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9691acb7-d684-4ebe-8e5f-59a97f495a88","Type":"ContainerStarted","Data":"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e"} Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.430480 4837 generic.go:334] "Generic (PLEG): container finished" podID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerID="07f1f4ad139a0ff3fd1e0f168b30f101cfaf9f882f276f26104ace570e865189" exitCode=0 Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.430634 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" event={"ID":"6c469b81-89a4-4d35-bc9a-b04b82c2571e","Type":"ContainerDied","Data":"07f1f4ad139a0ff3fd1e0f168b30f101cfaf9f882f276f26104ace570e865189"} Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.435668 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44c72978-1e69-4040-b174-cdc1c9ebe222","Type":"ContainerStarted","Data":"d30aabcf6a482d6d5198ddab0f4562b98579dcf0089cdc65c6349b28a2e98371"} Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.491000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-scripts\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 
13:17:08.491120 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-config-data\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.491275 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec34006-1792-4006-a01c-9e557626f347-logs\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.491402 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxh8\" (UniqueName: \"kubernetes.io/projected/6ec34006-1792-4006-a01c-9e557626f347-kube-api-access-psxh8\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.491889 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec34006-1792-4006-a01c-9e557626f347-logs\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.492101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ec34006-1792-4006-a01c-9e557626f347-horizon-secret-key\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.493126 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-scripts\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.493799 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-config-data\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.509394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ec34006-1792-4006-a01c-9e557626f347-horizon-secret-key\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.529757 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxh8\" (UniqueName: \"kubernetes.io/projected/6ec34006-1792-4006-a01c-9e557626f347-kube-api-access-psxh8\") pod \"horizon-768bbf6757-s92xp\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.662395 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:08 crc kubenswrapper[4837]: I1014 13:17:08.804855 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7de719a-1c3d-44eb-ab8b-7c260977d93c" path="/var/lib/kubelet/pods/c7de719a-1c3d-44eb-ab8b-7c260977d93c/volumes" Oct 14 13:17:09 crc kubenswrapper[4837]: W1014 13:17:09.198654 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ec34006_1792_4006_a01c_9e557626f347.slice/crio-f7e3ff9b24ab65e463acf2c0a21ea33d598f23b325a4a7c458b0b842aa1e481f WatchSource:0}: Error finding container f7e3ff9b24ab65e463acf2c0a21ea33d598f23b325a4a7c458b0b842aa1e481f: Status 404 returned error can't find the container with id f7e3ff9b24ab65e463acf2c0a21ea33d598f23b325a4a7c458b0b842aa1e481f Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.207313 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-768bbf6757-s92xp"] Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.504385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" event={"ID":"6c469b81-89a4-4d35-bc9a-b04b82c2571e","Type":"ContainerStarted","Data":"9d08ef2e1c90280b6a0b75b4ae79f5bd1ce77f65ce9892790425ec06439d51b6"} Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.505922 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.514367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44c72978-1e69-4040-b174-cdc1c9ebe222","Type":"ContainerStarted","Data":"4cb5aea86a7501571bef23d6f7879bbb3d138269b7e38715af88f79f3ca42b36"} Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.529825 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" podStartSLOduration=5.5298085839999995 podStartE2EDuration="5.529808584s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:17:09.522956357 +0000 UTC m=+967.439956190" watchObservedRunningTime="2025-10-14 13:17:09.529808584 +0000 UTC m=+967.446808397" Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.531233 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9691acb7-d684-4ebe-8e5f-59a97f495a88","Type":"ContainerStarted","Data":"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f"} Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.531417 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-log" containerID="cri-o://82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e" gracePeriod=30 Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.531730 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-httpd" containerID="cri-o://d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f" gracePeriod=30 Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.539251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-768bbf6757-s92xp" event={"ID":"6ec34006-1792-4006-a01c-9e557626f347","Type":"ContainerStarted","Data":"f7e3ff9b24ab65e463acf2c0a21ea33d598f23b325a4a7c458b0b842aa1e481f"} Oct 14 13:17:09 crc kubenswrapper[4837]: I1014 13:17:09.552091 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.552035351 
podStartE2EDuration="5.552035351s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:17:09.549600325 +0000 UTC m=+967.466600148" watchObservedRunningTime="2025-10-14 13:17:09.552035351 +0000 UTC m=+967.469035164" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.371244 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.439843 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-scripts\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.439899 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4n7p\" (UniqueName: \"kubernetes.io/projected/9691acb7-d684-4ebe-8e5f-59a97f495a88-kube-api-access-g4n7p\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.439929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-logs\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.439985 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-config-data\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.440049 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.440101 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-public-tls-certs\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.440123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-httpd-run\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.440209 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-combined-ca-bundle\") pod \"9691acb7-d684-4ebe-8e5f-59a97f495a88\" (UID: \"9691acb7-d684-4ebe-8e5f-59a97f495a88\") " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.459311 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-logs" (OuterVolumeSpecName: "logs") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.459603 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-scripts" (OuterVolumeSpecName: "scripts") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.459951 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.465670 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9691acb7-d684-4ebe-8e5f-59a97f495a88-kube-api-access-g4n7p" (OuterVolumeSpecName: "kube-api-access-g4n7p") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "kube-api-access-g4n7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.472485 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.491341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.507369 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.512567 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-config-data" (OuterVolumeSpecName: "config-data") pod "9691acb7-d684-4ebe-8e5f-59a97f495a88" (UID: "9691acb7-d684-4ebe-8e5f-59a97f495a88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542321 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542359 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542370 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4n7p\" (UniqueName: \"kubernetes.io/projected/9691acb7-d684-4ebe-8e5f-59a97f495a88-kube-api-access-g4n7p\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542382 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542393 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542429 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542441 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9691acb7-d684-4ebe-8e5f-59a97f495a88-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.542451 4837 reconciler_common.go:293] "Volume detached 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9691acb7-d684-4ebe-8e5f-59a97f495a88-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.553262 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44c72978-1e69-4040-b174-cdc1c9ebe222","Type":"ContainerStarted","Data":"921ebb64ed5624fc5cf0cf904225f1d951ecda123570c3773fd281d7ae0a77f3"} Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.553342 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-log" containerID="cri-o://4cb5aea86a7501571bef23d6f7879bbb3d138269b7e38715af88f79f3ca42b36" gracePeriod=30 Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.553389 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-httpd" containerID="cri-o://921ebb64ed5624fc5cf0cf904225f1d951ecda123570c3773fd281d7ae0a77f3" gracePeriod=30 Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.560343 4837 generic.go:334] "Generic (PLEG): container finished" podID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerID="d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f" exitCode=0 Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.560505 4837 generic.go:334] "Generic (PLEG): container finished" podID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerID="82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e" exitCode=143 Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.561962 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.562015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9691acb7-d684-4ebe-8e5f-59a97f495a88","Type":"ContainerDied","Data":"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f"} Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.562145 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9691acb7-d684-4ebe-8e5f-59a97f495a88","Type":"ContainerDied","Data":"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e"} Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.562177 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9691acb7-d684-4ebe-8e5f-59a97f495a88","Type":"ContainerDied","Data":"af6338c8e1470c2b4686d53910857e5d6df9963122a30163d40b2f6eca84abf8"} Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.562201 4837 scope.go:117] "RemoveContainer" containerID="d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.569726 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.593774 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.593742401 podStartE2EDuration="5.593742401s" podCreationTimestamp="2025-10-14 13:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:17:10.578082553 +0000 UTC m=+968.495082396" watchObservedRunningTime="2025-10-14 13:17:10.593742401 +0000 UTC m=+968.510742214" Oct 14 
13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.610103 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.615685 4837 scope.go:117] "RemoveContainer" containerID="82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.619854 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.648271 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.658036 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:10 crc kubenswrapper[4837]: E1014 13:17:10.658482 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-httpd" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.658509 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-httpd" Oct 14 13:17:10 crc kubenswrapper[4837]: E1014 13:17:10.658549 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-log" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.658559 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-log" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.659045 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-httpd" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.659085 4837 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" containerName="glance-log" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.660029 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.662614 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.667567 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.684382 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.701211 4837 scope.go:117] "RemoveContainer" containerID="d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f" Oct 14 13:17:10 crc kubenswrapper[4837]: E1014 13:17:10.704373 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f\": container with ID starting with d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f not found: ID does not exist" containerID="d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.704457 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f"} err="failed to get container status \"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f\": rpc error: code = NotFound desc = could not find container \"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f\": container with ID starting with 
d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f not found: ID does not exist" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.704493 4837 scope.go:117] "RemoveContainer" containerID="82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e" Oct 14 13:17:10 crc kubenswrapper[4837]: E1014 13:17:10.710602 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e\": container with ID starting with 82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e not found: ID does not exist" containerID="82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.710659 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e"} err="failed to get container status \"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e\": rpc error: code = NotFound desc = could not find container \"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e\": container with ID starting with 82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e not found: ID does not exist" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.710692 4837 scope.go:117] "RemoveContainer" containerID="d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.711178 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f"} err="failed to get container status \"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f\": rpc error: code = NotFound desc = could not find container \"d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f\": container with ID 
starting with d210fec8adaeb3e2e4660311cb81e60e1cd28c20ce7a63e2061cb694d58b4e8f not found: ID does not exist" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.711205 4837 scope.go:117] "RemoveContainer" containerID="82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.711898 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e"} err="failed to get container status \"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e\": rpc error: code = NotFound desc = could not find container \"82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e\": container with ID starting with 82f8272efef1d0087b8abea22c06f34f30bd38c2fea8d5f5fa66cc3d729b834e not found: ID does not exist" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.753046 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.753282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-logs\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.753395 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: 
\"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.753786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.753917 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-scripts\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.754367 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzvrq\" (UniqueName: \"kubernetes.io/projected/77623217-820b-4085-890a-d8afa93925f6-kube-api-access-mzvrq\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.754473 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-config-data\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.754524 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-public-tls-certs\") 
pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.813274 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9691acb7-d684-4ebe-8e5f-59a97f495a88" path="/var/lib/kubelet/pods/9691acb7-d684-4ebe-8e5f-59a97f495a88/volumes" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.856920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-logs\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857182 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857228 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-scripts\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzvrq\" (UniqueName: \"kubernetes.io/projected/77623217-820b-4085-890a-d8afa93925f6-kube-api-access-mzvrq\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857298 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-config-data\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857572 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857866 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.857971 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-logs\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.864983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.865191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-scripts\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.873242 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.874691 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.899104 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzvrq\" (UniqueName: \"kubernetes.io/projected/77623217-820b-4085-890a-d8afa93925f6-kube-api-access-mzvrq\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.906127 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " pod="openstack/glance-default-external-api-0" Oct 14 13:17:10 crc kubenswrapper[4837]: I1014 13:17:10.996066 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:17:11 crc kubenswrapper[4837]: I1014 13:17:11.575485 4837 generic.go:334] "Generic (PLEG): container finished" podID="d02e469c-24c1-4d5f-af08-d3974395e1a0" containerID="74dc92c73db3394647396a31b6fa40ed645d1a6841924c27c9136d65e22ce508" exitCode=0 Oct 14 13:17:11 crc kubenswrapper[4837]: I1014 13:17:11.575604 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4xk5" event={"ID":"d02e469c-24c1-4d5f-af08-d3974395e1a0","Type":"ContainerDied","Data":"74dc92c73db3394647396a31b6fa40ed645d1a6841924c27c9136d65e22ce508"} Oct 14 13:17:11 crc kubenswrapper[4837]: I1014 13:17:11.581685 4837 generic.go:334] "Generic (PLEG): container finished" podID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerID="921ebb64ed5624fc5cf0cf904225f1d951ecda123570c3773fd281d7ae0a77f3" exitCode=0 Oct 14 13:17:11 crc kubenswrapper[4837]: I1014 13:17:11.581709 4837 generic.go:334] "Generic (PLEG): 
container finished" podID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerID="4cb5aea86a7501571bef23d6f7879bbb3d138269b7e38715af88f79f3ca42b36" exitCode=143 Oct 14 13:17:11 crc kubenswrapper[4837]: I1014 13:17:11.581758 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44c72978-1e69-4040-b174-cdc1c9ebe222","Type":"ContainerDied","Data":"921ebb64ed5624fc5cf0cf904225f1d951ecda123570c3773fd281d7ae0a77f3"} Oct 14 13:17:11 crc kubenswrapper[4837]: I1014 13:17:11.581786 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44c72978-1e69-4040-b174-cdc1c9ebe222","Type":"ContainerDied","Data":"4cb5aea86a7501571bef23d6f7879bbb3d138269b7e38715af88f79f3ca42b36"} Oct 14 13:17:13 crc kubenswrapper[4837]: I1014 13:17:13.883086 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9998548fc-h9brf"] Oct 14 13:17:13 crc kubenswrapper[4837]: I1014 13:17:13.922590 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6655497d8d-h2r8r"] Oct 14 13:17:13 crc kubenswrapper[4837]: I1014 13:17:13.924003 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:13 crc kubenswrapper[4837]: I1014 13:17:13.926402 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 14 13:17:13 crc kubenswrapper[4837]: I1014 13:17:13.933609 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:17:13 crc kubenswrapper[4837]: I1014 13:17:13.957369 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6655497d8d-h2r8r"] Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.027775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-combined-ca-bundle\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.028064 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-logs\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.028121 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-scripts\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.028185 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-config-data\") pod 
\"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.028273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-tls-certs\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.028369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-secret-key\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.028460 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjtth\" (UniqueName: \"kubernetes.io/projected/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-kube-api-access-mjtth\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.039883 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-768bbf6757-s92xp"] Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.088854 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b48ff9644-mb62f"] Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.090750 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-horizon-secret-key\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130346 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-combined-ca-bundle\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130368 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d3a61c6-2a73-409f-b296-10f7a19685d6-scripts\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130394 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d3a61c6-2a73-409f-b296-10f7a19685d6-config-data\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nckg\" (UniqueName: \"kubernetes.io/projected/0d3a61c6-2a73-409f-b296-10f7a19685d6-kube-api-access-2nckg\") pod \"horizon-b48ff9644-mb62f\" (UID: 
\"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130744 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-horizon-tls-certs\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130813 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-combined-ca-bundle\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-logs\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.130935 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-scripts\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.131011 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3a61c6-2a73-409f-b296-10f7a19685d6-logs\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc 
kubenswrapper[4837]: I1014 13:17:14.131045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-config-data\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.131069 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-tls-certs\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.131112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-secret-key\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.131200 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjtth\" (UniqueName: \"kubernetes.io/projected/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-kube-api-access-mjtth\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.132069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-logs\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.132951 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-scripts\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.134063 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-config-data\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.138636 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-secret-key\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.140938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-combined-ca-bundle\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.151771 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-tls-certs\") pod \"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.160192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjtth\" (UniqueName: \"kubernetes.io/projected/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-kube-api-access-mjtth\") pod 
\"horizon-6655497d8d-h2r8r\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.185665 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b48ff9644-mb62f"] Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-horizon-secret-key\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-combined-ca-bundle\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d3a61c6-2a73-409f-b296-10f7a19685d6-scripts\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232679 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d3a61c6-2a73-409f-b296-10f7a19685d6-config-data\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232765 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nckg\" (UniqueName: 
\"kubernetes.io/projected/0d3a61c6-2a73-409f-b296-10f7a19685d6-kube-api-access-2nckg\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232797 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-horizon-tls-certs\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.232871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3a61c6-2a73-409f-b296-10f7a19685d6-logs\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.233455 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3a61c6-2a73-409f-b296-10f7a19685d6-logs\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.235096 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d3a61c6-2a73-409f-b296-10f7a19685d6-scripts\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.235858 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d3a61c6-2a73-409f-b296-10f7a19685d6-config-data\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " 
pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.238688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-horizon-secret-key\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.240734 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-horizon-tls-certs\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.242286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3a61c6-2a73-409f-b296-10f7a19685d6-combined-ca-bundle\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.250003 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.253643 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nckg\" (UniqueName: \"kubernetes.io/projected/0d3a61c6-2a73-409f-b296-10f7a19685d6-kube-api-access-2nckg\") pod \"horizon-b48ff9644-mb62f\" (UID: \"0d3a61c6-2a73-409f-b296-10f7a19685d6\") " pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:14 crc kubenswrapper[4837]: I1014 13:17:14.409436 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:17:15 crc kubenswrapper[4837]: I1014 13:17:15.522422 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:17:15 crc kubenswrapper[4837]: I1014 13:17:15.586500 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-cntf6"] Oct 14 13:17:15 crc kubenswrapper[4837]: I1014 13:17:15.586803 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" containerID="cri-o://644a9b4c376b82629ca8c5239b44064a1e85def5436830e1785a2d4ea653b902" gracePeriod=10 Oct 14 13:17:17 crc kubenswrapper[4837]: I1014 13:17:17.665553 4837 generic.go:334] "Generic (PLEG): container finished" podID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerID="644a9b4c376b82629ca8c5239b44064a1e85def5436830e1785a2d4ea653b902" exitCode=0 Oct 14 13:17:17 crc kubenswrapper[4837]: I1014 13:17:17.665650 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" event={"ID":"2188251b-90b0-46fe-9bb2-5b16cd3d1dac","Type":"ContainerDied","Data":"644a9b4c376b82629ca8c5239b44064a1e85def5436830e1785a2d4ea653b902"} Oct 14 13:17:19 crc kubenswrapper[4837]: I1014 13:17:19.077518 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.594081 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.599363 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.706959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44c72978-1e69-4040-b174-cdc1c9ebe222","Type":"ContainerDied","Data":"d30aabcf6a482d6d5198ddab0f4562b98579dcf0089cdc65c6349b28a2e98371"} Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.707005 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.707048 4837 scope.go:117] "RemoveContainer" containerID="921ebb64ed5624fc5cf0cf904225f1d951ecda123570c3773fd281d7ae0a77f3" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.709867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4xk5" event={"ID":"d02e469c-24c1-4d5f-af08-d3974395e1a0","Type":"ContainerDied","Data":"f600308babb7fc2f13afc108cdba03a099d4406c6c7816a9cbd3990d20b7b183"} Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.709911 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f600308babb7fc2f13afc108cdba03a099d4406c6c7816a9cbd3990d20b7b183" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.709944 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k4xk5" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.722827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwqss\" (UniqueName: \"kubernetes.io/projected/44c72978-1e69-4040-b174-cdc1c9ebe222-kube-api-access-dwqss\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.722902 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-config-data\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723004 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-config-data\") pod \"d02e469c-24c1-4d5f-af08-d3974395e1a0\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723062 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-logs\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-scripts\") pod \"d02e469c-24c1-4d5f-af08-d3974395e1a0\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723132 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htzkf\" (UniqueName: 
\"kubernetes.io/projected/d02e469c-24c1-4d5f-af08-d3974395e1a0-kube-api-access-htzkf\") pod \"d02e469c-24c1-4d5f-af08-d3974395e1a0\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-httpd-run\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723212 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-credential-keys\") pod \"d02e469c-24c1-4d5f-af08-d3974395e1a0\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723243 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-combined-ca-bundle\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723278 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-internal-tls-certs\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-combined-ca-bundle\") pod \"d02e469c-24c1-4d5f-af08-d3974395e1a0\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " Oct 14 13:17:21 crc kubenswrapper[4837]: 
I1014 13:17:21.723397 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-scripts\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723459 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"44c72978-1e69-4040-b174-cdc1c9ebe222\" (UID: \"44c72978-1e69-4040-b174-cdc1c9ebe222\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723485 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-fernet-keys\") pod \"d02e469c-24c1-4d5f-af08-d3974395e1a0\" (UID: \"d02e469c-24c1-4d5f-af08-d3974395e1a0\") " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.723748 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-logs" (OuterVolumeSpecName: "logs") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.724495 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.730765 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.732213 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d02e469c-24c1-4d5f-af08-d3974395e1a0" (UID: "d02e469c-24c1-4d5f-af08-d3974395e1a0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.732292 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d02e469c-24c1-4d5f-af08-d3974395e1a0" (UID: "d02e469c-24c1-4d5f-af08-d3974395e1a0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.732322 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-scripts" (OuterVolumeSpecName: "scripts") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.732376 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.732606 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02e469c-24c1-4d5f-af08-d3974395e1a0-kube-api-access-htzkf" (OuterVolumeSpecName: "kube-api-access-htzkf") pod "d02e469c-24c1-4d5f-af08-d3974395e1a0" (UID: "d02e469c-24c1-4d5f-af08-d3974395e1a0"). InnerVolumeSpecName "kube-api-access-htzkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.735095 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c72978-1e69-4040-b174-cdc1c9ebe222-kube-api-access-dwqss" (OuterVolumeSpecName: "kube-api-access-dwqss") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "kube-api-access-dwqss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.738777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-scripts" (OuterVolumeSpecName: "scripts") pod "d02e469c-24c1-4d5f-af08-d3974395e1a0" (UID: "d02e469c-24c1-4d5f-af08-d3974395e1a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.757259 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.769748 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d02e469c-24c1-4d5f-af08-d3974395e1a0" (UID: "d02e469c-24c1-4d5f-af08-d3974395e1a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.779180 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-config-data" (OuterVolumeSpecName: "config-data") pod "d02e469c-24c1-4d5f-af08-d3974395e1a0" (UID: "d02e469c-24c1-4d5f-af08-d3974395e1a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.793895 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-config-data" (OuterVolumeSpecName: "config-data") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.805464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44c72978-1e69-4040-b174-cdc1c9ebe222" (UID: "44c72978-1e69-4040-b174-cdc1c9ebe222"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825849 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825904 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htzkf\" (UniqueName: \"kubernetes.io/projected/d02e469c-24c1-4d5f-af08-d3974395e1a0-kube-api-access-htzkf\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825918 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825929 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44c72978-1e69-4040-b174-cdc1c9ebe222-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825941 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825953 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825964 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825975 4837 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.825986 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.826028 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.826040 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d02e469c-24c1-4d5f-af08-d3974395e1a0-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.826051 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwqss\" (UniqueName: \"kubernetes.io/projected/44c72978-1e69-4040-b174-cdc1c9ebe222-kube-api-access-dwqss\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.826062 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c72978-1e69-4040-b174-cdc1c9ebe222-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.853732 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 14 13:17:21 crc kubenswrapper[4837]: I1014 13:17:21.927540 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:22 
crc kubenswrapper[4837]: I1014 13:17:22.062767 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.074064 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.082342 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:22 crc kubenswrapper[4837]: E1014 13:17:22.082812 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-httpd" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.082837 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-httpd" Oct 14 13:17:22 crc kubenswrapper[4837]: E1014 13:17:22.082870 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-log" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.082879 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-log" Oct 14 13:17:22 crc kubenswrapper[4837]: E1014 13:17:22.082896 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02e469c-24c1-4d5f-af08-d3974395e1a0" containerName="keystone-bootstrap" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.082904 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02e469c-24c1-4d5f-af08-d3974395e1a0" containerName="keystone-bootstrap" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.083132 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02e469c-24c1-4d5f-af08-d3974395e1a0" containerName="keystone-bootstrap" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.083179 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-httpd" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.083194 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" containerName="glance-log" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.084321 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.087073 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.088072 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.101025 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235213 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235469 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235534 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n455k\" (UniqueName: \"kubernetes.io/projected/e3b427fc-538b-4823-8ef3-8bab1765faee-kube-api-access-n455k\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.235686 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.336903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.336976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n455k\" (UniqueName: \"kubernetes.io/projected/e3b427fc-538b-4823-8ef3-8bab1765faee-kube-api-access-n455k\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337040 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337068 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337123 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337198 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337650 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.337813 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.339857 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.343972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.345710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.383310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.386471 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n455k\" (UniqueName: \"kubernetes.io/projected/e3b427fc-538b-4823-8ef3-8bab1765faee-kube-api-access-n455k\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " 
pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.389891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.432081 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.713926 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.728198 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k4xk5"] Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.735456 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k4xk5"] Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.815973 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c72978-1e69-4040-b174-cdc1c9ebe222" path="/var/lib/kubelet/pods/44c72978-1e69-4040-b174-cdc1c9ebe222/volumes" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.819316 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02e469c-24c1-4d5f-af08-d3974395e1a0" path="/var/lib/kubelet/pods/d02e469c-24c1-4d5f-af08-d3974395e1a0/volumes" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.819893 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7h69"] Oct 14 13:17:22 crc kubenswrapper[4837]: 
I1014 13:17:22.821922 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.824642 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.824952 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6z55" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.825152 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.825912 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.828285 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7h69"] Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.947455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-combined-ca-bundle\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.947565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-scripts\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.947645 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-config-data\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.947723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-credential-keys\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.947767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-fernet-keys\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:22 crc kubenswrapper[4837]: I1014 13:17:22.947795 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shlx9\" (UniqueName: \"kubernetes.io/projected/2646558f-772d-41e3-8079-ae80e140a23a-kube-api-access-shlx9\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.048978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-scripts\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.049057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-config-data\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.049096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-credential-keys\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.049127 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-fernet-keys\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.049146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shlx9\" (UniqueName: \"kubernetes.io/projected/2646558f-772d-41e3-8079-ae80e140a23a-kube-api-access-shlx9\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.049211 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-combined-ca-bundle\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.052960 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-combined-ca-bundle\") pod 
\"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.053500 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-scripts\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.053678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-credential-keys\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.058548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-config-data\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.065982 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shlx9\" (UniqueName: \"kubernetes.io/projected/2646558f-772d-41e3-8079-ae80e140a23a-kube-api-access-shlx9\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc kubenswrapper[4837]: I1014 13:17:23.067328 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-fernet-keys\") pod \"keystone-bootstrap-s7h69\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:23 crc 
kubenswrapper[4837]: I1014 13:17:23.199847 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:17:24 crc kubenswrapper[4837]: I1014 13:17:24.078149 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.029977 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.030693 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n579h694h655hc4h697h77h575h5fbh89hbfh55fh64dh68ch64dh6bh59dh7dh54ch54dh7dh6dhc6h595h68dh566h66h5c6h5cch77h5h679hd5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mvm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-9998548fc-h9brf_openstack(9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 
13:17:32.033677 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-9998548fc-h9brf" podUID="9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.417125 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.417432 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n698h6bhffh656h645h667hb8h55ch576h87h567h5f9h556h68hb5h6hcdh5b5h567h55dh54fh87h645h657h99hcfh68ch9dh85h579h5ddh674q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6p8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-84646c67f7-4sb8f_openstack(e0e6fcb6-6898-40ca-af1d-79a445d128c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 
13:17:32.420620 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-84646c67f7-4sb8f" podUID="e0e6fcb6-6898-40ca-af1d-79a445d128c8" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.778738 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.779540 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b9hdfh675hbfh9bh5f9h5b4h666hbbh5c6h5fch68fhc7h66ch5cbh57h589h5bfhf4h7fh58fhch66bh54fh5fh84h55ch684h95hb4h65bh68dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psxh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-768bbf6757-s92xp_openstack(6ec34006-1792-4006-a01c-9e557626f347): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 
13:17:32.783255 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-768bbf6757-s92xp" podUID="6ec34006-1792-4006-a01c-9e557626f347" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.906069 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 14 13:17:32 crc kubenswrapper[4837]: E1014 13:17:32.906605 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n597h57ch696h5ddh567h8ch7fhfbhfch679h5f6h5b7h648hddhc7hffh5c4h679h559hd7h659h87h76h554h67ch5f7hf6hdbh58bh58chc4h696q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wc76j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(58d55e59-7431-474a-a2eb-be646017f3c2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:17:34 crc kubenswrapper[4837]: I1014 13:17:34.077538 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:17:34 crc kubenswrapper[4837]: I1014 13:17:34.077684 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:17:39 crc kubenswrapper[4837]: I1014 13:17:39.079333 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.518322 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.524692 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696596 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ec34006-1792-4006-a01c-9e557626f347-horizon-secret-key\") pod \"6ec34006-1792-4006-a01c-9e557626f347\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696653 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-scripts\") pod \"6ec34006-1792-4006-a01c-9e557626f347\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxh8\" (UniqueName: \"kubernetes.io/projected/6ec34006-1792-4006-a01c-9e557626f347-kube-api-access-psxh8\") pod \"6ec34006-1792-4006-a01c-9e557626f347\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696763 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec34006-1792-4006-a01c-9e557626f347-logs\") pod \"6ec34006-1792-4006-a01c-9e557626f347\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-logs\") pod \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " Oct 14 
13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696829 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-config-data\") pod \"6ec34006-1792-4006-a01c-9e557626f347\" (UID: \"6ec34006-1792-4006-a01c-9e557626f347\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-scripts\") pod \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696972 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-horizon-secret-key\") pod \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.696998 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-config-data\") pod \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.697020 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mvm4\" (UniqueName: \"kubernetes.io/projected/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-kube-api-access-9mvm4\") pod \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\" (UID: \"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee\") " Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.697571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-logs" (OuterVolumeSpecName: "logs") pod 
"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" (UID: "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.697977 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec34006-1792-4006-a01c-9e557626f347-logs" (OuterVolumeSpecName: "logs") pod "6ec34006-1792-4006-a01c-9e557626f347" (UID: "6ec34006-1792-4006-a01c-9e557626f347"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.698008 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-scripts" (OuterVolumeSpecName: "scripts") pod "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" (UID: "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.698651 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-scripts" (OuterVolumeSpecName: "scripts") pod "6ec34006-1792-4006-a01c-9e557626f347" (UID: "6ec34006-1792-4006-a01c-9e557626f347"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.698955 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-config-data" (OuterVolumeSpecName: "config-data") pod "6ec34006-1792-4006-a01c-9e557626f347" (UID: "6ec34006-1792-4006-a01c-9e557626f347"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.698972 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-config-data" (OuterVolumeSpecName: "config-data") pod "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" (UID: "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.704142 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec34006-1792-4006-a01c-9e557626f347-kube-api-access-psxh8" (OuterVolumeSpecName: "kube-api-access-psxh8") pod "6ec34006-1792-4006-a01c-9e557626f347" (UID: "6ec34006-1792-4006-a01c-9e557626f347"). InnerVolumeSpecName "kube-api-access-psxh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.705035 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec34006-1792-4006-a01c-9e557626f347-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6ec34006-1792-4006-a01c-9e557626f347" (UID: "6ec34006-1792-4006-a01c-9e557626f347"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.705179 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-kube-api-access-9mvm4" (OuterVolumeSpecName: "kube-api-access-9mvm4") pod "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" (UID: "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee"). InnerVolumeSpecName "kube-api-access-9mvm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.705210 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" (UID: "9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799353 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxh8\" (UniqueName: \"kubernetes.io/projected/6ec34006-1792-4006-a01c-9e557626f347-kube-api-access-psxh8\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799425 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec34006-1792-4006-a01c-9e557626f347-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799443 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799460 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799478 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799493 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799508 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799522 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mvm4\" (UniqueName: \"kubernetes.io/projected/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee-kube-api-access-9mvm4\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799537 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ec34006-1792-4006-a01c-9e557626f347-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.799553 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ec34006-1792-4006-a01c-9e557626f347-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.894513 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-768bbf6757-s92xp" event={"ID":"6ec34006-1792-4006-a01c-9e557626f347","Type":"ContainerDied","Data":"f7e3ff9b24ab65e463acf2c0a21ea33d598f23b325a4a7c458b0b842aa1e481f"} Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.894530 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-768bbf6757-s92xp" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.895680 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9998548fc-h9brf" event={"ID":"9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee","Type":"ContainerDied","Data":"fb92ae5929388fc4e1bf0a48cdc6d455e8700782a59f143d010d1a22c50e417c"} Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.895770 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9998548fc-h9brf" Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.948837 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-768bbf6757-s92xp"] Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.965928 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-768bbf6757-s92xp"] Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.987340 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9998548fc-h9brf"] Oct 14 13:17:42 crc kubenswrapper[4837]: I1014 13:17:42.997420 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9998548fc-h9brf"] Oct 14 13:17:44 crc kubenswrapper[4837]: I1014 13:17:44.080236 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:17:44 crc kubenswrapper[4837]: I1014 13:17:44.795563 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec34006-1792-4006-a01c-9e557626f347" path="/var/lib/kubelet/pods/6ec34006-1792-4006-a01c-9e557626f347/volumes" Oct 14 13:17:44 crc kubenswrapper[4837]: I1014 13:17:44.796009 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee" 
path="/var/lib/kubelet/pods/9a06f79e-1f0f-4c64-b7e1-24e0bfc0acee/volumes" Oct 14 13:17:49 crc kubenswrapper[4837]: I1014 13:17:49.080908 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:17:54 crc kubenswrapper[4837]: I1014 13:17:54.081878 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:17:59 crc kubenswrapper[4837]: I1014 13:17:59.083151 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:18:01 crc kubenswrapper[4837]: E1014 13:18:01.927816 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2398971497/1\": happened during read: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Oct 14 13:18:01 crc kubenswrapper[4837]: E1014 13:18:01.928108 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n597h57ch696h5ddh567h8ch7fhfbhfch679h5f6h5b7h648hddhc7hffh5c4h679h559hd7h659h87h76h554h67ch5f7hf6hdbh58bh58chc4h696q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wc76j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(58d55e59-7431-474a-a2eb-be646017f3c2): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage2398971497/1\": happened during read: context canceled" logger="UnhandledError" Oct 14 13:18:01 crc kubenswrapper[4837]: I1014 13:18:01.985927 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:18:01 crc kubenswrapper[4837]: I1014 13:18:01.993406 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.054522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" event={"ID":"2188251b-90b0-46fe-9bb2-5b16cd3d1dac","Type":"ContainerDied","Data":"f0790595b0d8578b063d709db9a141bead75efb47096618555a636ce11f155d2"} Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.054535 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.056035 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84646c67f7-4sb8f" event={"ID":"e0e6fcb6-6898-40ca-af1d-79a445d128c8","Type":"ContainerDied","Data":"a92a436da8b48a79231b8df745f377eaec3c5b2a583d8f471467e6748ca1541c"} Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.056068 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84646c67f7-4sb8f" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6p8t\" (UniqueName: \"kubernetes.io/projected/e0e6fcb6-6898-40ca-af1d-79a445d128c8-kube-api-access-t6p8t\") pod \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084293 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e6fcb6-6898-40ca-af1d-79a445d128c8-logs\") pod \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084312 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-sb\") pod \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084346 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-nb\") pod \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " Oct 14 13:18:02 crc 
kubenswrapper[4837]: I1014 13:18:02.084382 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74qqt\" (UniqueName: \"kubernetes.io/projected/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-kube-api-access-74qqt\") pod \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-scripts\") pod \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-config-data\") pod \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084543 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-swift-storage-0\") pod \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0e6fcb6-6898-40ca-af1d-79a445d128c8-horizon-secret-key\") pod \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\" (UID: \"e0e6fcb6-6898-40ca-af1d-79a445d128c8\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-svc\") pod \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.084635 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-config\") pod \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\" (UID: \"2188251b-90b0-46fe-9bb2-5b16cd3d1dac\") " Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.085535 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e6fcb6-6898-40ca-af1d-79a445d128c8-logs" (OuterVolumeSpecName: "logs") pod "e0e6fcb6-6898-40ca-af1d-79a445d128c8" (UID: "e0e6fcb6-6898-40ca-af1d-79a445d128c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.085799 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-config-data" (OuterVolumeSpecName: "config-data") pod "e0e6fcb6-6898-40ca-af1d-79a445d128c8" (UID: "e0e6fcb6-6898-40ca-af1d-79a445d128c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.085871 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-scripts" (OuterVolumeSpecName: "scripts") pod "e0e6fcb6-6898-40ca-af1d-79a445d128c8" (UID: "e0e6fcb6-6898-40ca-af1d-79a445d128c8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.090591 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-kube-api-access-74qqt" (OuterVolumeSpecName: "kube-api-access-74qqt") pod "2188251b-90b0-46fe-9bb2-5b16cd3d1dac" (UID: "2188251b-90b0-46fe-9bb2-5b16cd3d1dac"). InnerVolumeSpecName "kube-api-access-74qqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.125444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2188251b-90b0-46fe-9bb2-5b16cd3d1dac" (UID: "2188251b-90b0-46fe-9bb2-5b16cd3d1dac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.129393 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-config" (OuterVolumeSpecName: "config") pod "2188251b-90b0-46fe-9bb2-5b16cd3d1dac" (UID: "2188251b-90b0-46fe-9bb2-5b16cd3d1dac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.130342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2188251b-90b0-46fe-9bb2-5b16cd3d1dac" (UID: "2188251b-90b0-46fe-9bb2-5b16cd3d1dac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.135496 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2188251b-90b0-46fe-9bb2-5b16cd3d1dac" (UID: "2188251b-90b0-46fe-9bb2-5b16cd3d1dac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.142131 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2188251b-90b0-46fe-9bb2-5b16cd3d1dac" (UID: "2188251b-90b0-46fe-9bb2-5b16cd3d1dac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187025 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187064 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e6fcb6-6898-40ca-af1d-79a445d128c8-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187076 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187088 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-dns-svc\") on node \"crc\" DevicePath 
\"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187099 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187109 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e6fcb6-6898-40ca-af1d-79a445d128c8-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187119 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187128 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.187136 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74qqt\" (UniqueName: \"kubernetes.io/projected/2188251b-90b0-46fe-9bb2-5b16cd3d1dac-kube-api-access-74qqt\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.277610 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e6fcb6-6898-40ca-af1d-79a445d128c8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e0e6fcb6-6898-40ca-af1d-79a445d128c8" (UID: "e0e6fcb6-6898-40ca-af1d-79a445d128c8"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.288634 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0e6fcb6-6898-40ca-af1d-79a445d128c8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.411199 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e6fcb6-6898-40ca-af1d-79a445d128c8-kube-api-access-t6p8t" (OuterVolumeSpecName: "kube-api-access-t6p8t") pod "e0e6fcb6-6898-40ca-af1d-79a445d128c8" (UID: "e0e6fcb6-6898-40ca-af1d-79a445d128c8"). InnerVolumeSpecName "kube-api-access-t6p8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.414152 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-cntf6"] Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.422866 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-cntf6"] Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.492694 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6p8t\" (UniqueName: \"kubernetes.io/projected/e0e6fcb6-6898-40ca-af1d-79a445d128c8-kube-api-access-t6p8t\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:02 crc kubenswrapper[4837]: E1014 13:18:02.614931 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 14 13:18:02 crc kubenswrapper[4837]: E1014 13:18:02.615134 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kz99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-b69bq_openstack(6d33f0a5-b130-4614-9636-fa0d61fa4e11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:18:02 crc kubenswrapper[4837]: E1014 13:18:02.616519 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-b69bq" 
podUID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.723756 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84646c67f7-4sb8f"] Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.732877 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84646c67f7-4sb8f"] Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.794232 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" path="/var/lib/kubelet/pods/2188251b-90b0-46fe-9bb2-5b16cd3d1dac/volumes" Oct 14 13:18:02 crc kubenswrapper[4837]: I1014 13:18:02.795318 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e6fcb6-6898-40ca-af1d-79a445d128c8" path="/var/lib/kubelet/pods/e0e6fcb6-6898-40ca-af1d-79a445d128c8/volumes" Oct 14 13:18:03 crc kubenswrapper[4837]: E1014 13:18:03.066213 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-b69bq" podUID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" Oct 14 13:18:03 crc kubenswrapper[4837]: I1014 13:18:03.682341 4837 scope.go:117] "RemoveContainer" containerID="4cb5aea86a7501571bef23d6f7879bbb3d138269b7e38715af88f79f3ca42b36" Oct 14 13:18:03 crc kubenswrapper[4837]: I1014 13:18:03.878261 4837 scope.go:117] "RemoveContainer" containerID="644a9b4c376b82629ca8c5239b44064a1e85def5436830e1785a2d4ea653b902" Oct 14 13:18:03 crc kubenswrapper[4837]: I1014 13:18:03.923136 4837 scope.go:117] "RemoveContainer" containerID="2298cf1400e8ce4833997890ed56d35a017ddf8819730c49a57ddaa49e737262" Oct 14 13:18:03 crc kubenswrapper[4837]: E1014 13:18:03.974128 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 14 13:18:03 crc kubenswrapper[4837]: E1014 13:18:03.974308 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7vcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Live
nessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-5j7sn_openstack(14d2b3ef-eed6-48cb-948b-3618d6f53fff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:18:03 crc kubenswrapper[4837]: E1014 13:18:03.975863 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-5j7sn" podUID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" Oct 14 13:18:04 crc kubenswrapper[4837]: E1014 13:18:04.074268 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-5j7sn" podUID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" Oct 14 13:18:04 crc kubenswrapper[4837]: I1014 13:18:04.086055 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-895cf5cf-cntf6" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Oct 14 13:18:04 crc kubenswrapper[4837]: I1014 13:18:04.299129 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7h69"] Oct 14 13:18:04 
crc kubenswrapper[4837]: I1014 13:18:04.305775 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b48ff9644-mb62f"] Oct 14 13:18:04 crc kubenswrapper[4837]: I1014 13:18:04.428865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6655497d8d-h2r8r"] Oct 14 13:18:04 crc kubenswrapper[4837]: W1014 13:18:04.429950 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2372174a_82ac_4421_a4e7_8ffcd7b4e92f.slice/crio-5beb8d9c77a85be189ef4cdb1f7c88ef94c285be00447b49a472d62ee70c4b74 WatchSource:0}: Error finding container 5beb8d9c77a85be189ef4cdb1f7c88ef94c285be00447b49a472d62ee70c4b74: Status 404 returned error can't find the container with id 5beb8d9c77a85be189ef4cdb1f7c88ef94c285be00447b49a472d62ee70c4b74 Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.068790 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.110211 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jddnh" event={"ID":"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd","Type":"ContainerStarted","Data":"38e3d1774dbf6264443841868486acadd41cbf5d1855735693fcfd529af7260e"} Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.112727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6655497d8d-h2r8r" event={"ID":"2372174a-82ac-4421-a4e7-8ffcd7b4e92f","Type":"ContainerStarted","Data":"5beb8d9c77a85be189ef4cdb1f7c88ef94c285be00447b49a472d62ee70c4b74"} Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.114133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48ff9644-mb62f" event={"ID":"0d3a61c6-2a73-409f-b296-10f7a19685d6","Type":"ContainerStarted","Data":"129b10cacc2295344ad8aa6792c6d9d97fe79c7e582fe67edd2e076ecc066ac6"} Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 
13:18:05.116381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7h69" event={"ID":"2646558f-772d-41e3-8079-ae80e140a23a","Type":"ContainerStarted","Data":"7df6eeafaba5dada356f67f2737d60da0144aeedf8d39cde437bac7240312815"} Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.116412 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7h69" event={"ID":"2646558f-772d-41e3-8079-ae80e140a23a","Type":"ContainerStarted","Data":"6f6fea158fb06ef8cc6a59fc85de852bd8e008d3f2b84dfaa0a0292ab6e02b05"} Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.137902 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jddnh" podStartSLOduration=6.045624914 podStartE2EDuration="1m1.13788251s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="2025-10-14 13:17:06.781809965 +0000 UTC m=+964.698809768" lastFinishedPulling="2025-10-14 13:18:01.874067531 +0000 UTC m=+1019.791067364" observedRunningTime="2025-10-14 13:18:05.129230029 +0000 UTC m=+1023.046229872" watchObservedRunningTime="2025-10-14 13:18:05.13788251 +0000 UTC m=+1023.054882323" Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.146405 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7h69" podStartSLOduration=43.146385658 podStartE2EDuration="43.146385658s" podCreationTimestamp="2025-10-14 13:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:05.145658729 +0000 UTC m=+1023.062658542" watchObservedRunningTime="2025-10-14 13:18:05.146385658 +0000 UTC m=+1023.063385471" Oct 14 13:18:05 crc kubenswrapper[4837]: I1014 13:18:05.628556 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:18:05 crc kubenswrapper[4837]: W1014 13:18:05.692082 4837 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77623217_820b_4085_890a_d8afa93925f6.slice/crio-ebe749d9360d2e756ccd0e5f8cbeeba94145b0c773697cd211fee2cf0984e5a4 WatchSource:0}: Error finding container ebe749d9360d2e756ccd0e5f8cbeeba94145b0c773697cd211fee2cf0984e5a4: Status 404 returned error can't find the container with id ebe749d9360d2e756ccd0e5f8cbeeba94145b0c773697cd211fee2cf0984e5a4 Oct 14 13:18:06 crc kubenswrapper[4837]: I1014 13:18:06.126889 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77623217-820b-4085-890a-d8afa93925f6","Type":"ContainerStarted","Data":"ebe749d9360d2e756ccd0e5f8cbeeba94145b0c773697cd211fee2cf0984e5a4"} Oct 14 13:18:06 crc kubenswrapper[4837]: I1014 13:18:06.128735 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6655497d8d-h2r8r" event={"ID":"2372174a-82ac-4421-a4e7-8ffcd7b4e92f","Type":"ContainerStarted","Data":"b49e994a7536096b426fc73f5d5665d1e524c08924c7a054563890c3689310a7"} Oct 14 13:18:06 crc kubenswrapper[4837]: I1014 13:18:06.130336 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3b427fc-538b-4823-8ef3-8bab1765faee","Type":"ContainerStarted","Data":"cb88e62a1892c980c3120b04651c4bf852a694eb483b15b3ef58a88d1cef8227"} Oct 14 13:18:06 crc kubenswrapper[4837]: I1014 13:18:06.130378 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3b427fc-538b-4823-8ef3-8bab1765faee","Type":"ContainerStarted","Data":"5ad241804e54013bfd390e2f2d37fdcc9e46991b75a10f9affa360a0b8dd9f25"} Oct 14 13:18:06 crc kubenswrapper[4837]: I1014 13:18:06.132872 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48ff9644-mb62f" 
event={"ID":"0d3a61c6-2a73-409f-b296-10f7a19685d6","Type":"ContainerStarted","Data":"b6e2d16dbd379f3a29093106a023c024fd5eda678b68596a269df67ed5267d07"} Oct 14 13:18:07 crc kubenswrapper[4837]: I1014 13:18:07.145333 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77623217-820b-4085-890a-d8afa93925f6","Type":"ContainerStarted","Data":"bd05c40b9c92cf4821f125bb79d074bf17a6db4bfaa9b1a0fccee83a79fc8432"} Oct 14 13:18:07 crc kubenswrapper[4837]: I1014 13:18:07.147556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6655497d8d-h2r8r" event={"ID":"2372174a-82ac-4421-a4e7-8ffcd7b4e92f","Type":"ContainerStarted","Data":"ce844888499b913841277f9d1a2c07dd86967378d2867cfebad77ec15be8e2c0"} Oct 14 13:18:07 crc kubenswrapper[4837]: I1014 13:18:07.164028 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3b427fc-538b-4823-8ef3-8bab1765faee","Type":"ContainerStarted","Data":"7f25a0404cc944c68b93645f8ffbce8a82ecc4281bf1bcb1f9ddbaaba056010e"} Oct 14 13:18:07 crc kubenswrapper[4837]: I1014 13:18:07.183726 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6655497d8d-h2r8r" podStartSLOduration=53.067818849 podStartE2EDuration="54.183708093s" podCreationTimestamp="2025-10-14 13:17:13 +0000 UTC" firstStartedPulling="2025-10-14 13:18:04.43166933 +0000 UTC m=+1022.348669143" lastFinishedPulling="2025-10-14 13:18:05.547558574 +0000 UTC m=+1023.464558387" observedRunningTime="2025-10-14 13:18:07.180647901 +0000 UTC m=+1025.097647744" watchObservedRunningTime="2025-10-14 13:18:07.183708093 +0000 UTC m=+1025.100707906" Oct 14 13:18:07 crc kubenswrapper[4837]: I1014 13:18:07.209100 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=45.209080963 podStartE2EDuration="45.209080963s" 
podCreationTimestamp="2025-10-14 13:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:07.202506266 +0000 UTC m=+1025.119506079" watchObservedRunningTime="2025-10-14 13:18:07.209080963 +0000 UTC m=+1025.126080766" Oct 14 13:18:08 crc kubenswrapper[4837]: I1014 13:18:08.175772 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48ff9644-mb62f" event={"ID":"0d3a61c6-2a73-409f-b296-10f7a19685d6","Type":"ContainerStarted","Data":"6b075ecb06e07ad97b6d08bfacfe13bcb545dd1c9b55730fef82a57da09c7d22"} Oct 14 13:18:08 crc kubenswrapper[4837]: I1014 13:18:08.202386 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b48ff9644-mb62f" podStartSLOduration=52.8409711 podStartE2EDuration="54.202000646s" podCreationTimestamp="2025-10-14 13:17:14 +0000 UTC" firstStartedPulling="2025-10-14 13:18:04.356650742 +0000 UTC m=+1022.273650565" lastFinishedPulling="2025-10-14 13:18:05.717680298 +0000 UTC m=+1023.634680111" observedRunningTime="2025-10-14 13:18:08.197830615 +0000 UTC m=+1026.114830448" watchObservedRunningTime="2025-10-14 13:18:08.202000646 +0000 UTC m=+1026.119000469" Oct 14 13:18:12 crc kubenswrapper[4837]: I1014 13:18:12.715261 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:12 crc kubenswrapper[4837]: I1014 13:18:12.715605 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:12 crc kubenswrapper[4837]: I1014 13:18:12.754760 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:12 crc kubenswrapper[4837]: I1014 13:18:12.822734 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 14 
13:18:13 crc kubenswrapper[4837]: I1014 13:18:13.216410 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:13 crc kubenswrapper[4837]: I1014 13:18:13.216451 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:14 crc kubenswrapper[4837]: I1014 13:18:14.250514 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:18:14 crc kubenswrapper[4837]: I1014 13:18:14.251405 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:18:14 crc kubenswrapper[4837]: I1014 13:18:14.410059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:18:14 crc kubenswrapper[4837]: I1014 13:18:14.410960 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:18:15 crc kubenswrapper[4837]: I1014 13:18:15.095717 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:15 crc kubenswrapper[4837]: I1014 13:18:15.232153 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:18:15 crc kubenswrapper[4837]: I1014 13:18:15.764303 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 14 13:18:20 crc kubenswrapper[4837]: I1014 13:18:20.279633 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77623217-820b-4085-890a-d8afa93925f6","Type":"ContainerStarted","Data":"1786bf6810e92d711d1cb9b12dbc7aecbbaf2c280f288f6443f32a3c411c1588"} Oct 14 13:18:21 crc kubenswrapper[4837]: I1014 13:18:21.287315 4837 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-external-api-0" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-log" containerID="cri-o://bd05c40b9c92cf4821f125bb79d074bf17a6db4bfaa9b1a0fccee83a79fc8432" gracePeriod=30 Oct 14 13:18:21 crc kubenswrapper[4837]: I1014 13:18:21.287415 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-httpd" containerID="cri-o://1786bf6810e92d711d1cb9b12dbc7aecbbaf2c280f288f6443f32a3c411c1588" gracePeriod=30 Oct 14 13:18:21 crc kubenswrapper[4837]: I1014 13:18:21.319616 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=71.319600955 podStartE2EDuration="1m11.319600955s" podCreationTimestamp="2025-10-14 13:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:21.311535619 +0000 UTC m=+1039.228535452" watchObservedRunningTime="2025-10-14 13:18:21.319600955 +0000 UTC m=+1039.236600768" Oct 14 13:18:22 crc kubenswrapper[4837]: I1014 13:18:22.297149 4837 generic.go:334] "Generic (PLEG): container finished" podID="77623217-820b-4085-890a-d8afa93925f6" containerID="bd05c40b9c92cf4821f125bb79d074bf17a6db4bfaa9b1a0fccee83a79fc8432" exitCode=143 Oct 14 13:18:22 crc kubenswrapper[4837]: I1014 13:18:22.297206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77623217-820b-4085-890a-d8afa93925f6","Type":"ContainerDied","Data":"bd05c40b9c92cf4821f125bb79d074bf17a6db4bfaa9b1a0fccee83a79fc8432"} Oct 14 13:18:23 crc kubenswrapper[4837]: I1014 13:18:23.319345 4837 generic.go:334] "Generic (PLEG): container finished" podID="77623217-820b-4085-890a-d8afa93925f6" containerID="1786bf6810e92d711d1cb9b12dbc7aecbbaf2c280f288f6443f32a3c411c1588" exitCode=0 
Oct 14 13:18:23 crc kubenswrapper[4837]: I1014 13:18:23.319637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77623217-820b-4085-890a-d8afa93925f6","Type":"ContainerDied","Data":"1786bf6810e92d711d1cb9b12dbc7aecbbaf2c280f288f6443f32a3c411c1588"} Oct 14 13:18:24 crc kubenswrapper[4837]: I1014 13:18:24.252574 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6655497d8d-h2r8r" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 14 13:18:24 crc kubenswrapper[4837]: I1014 13:18:24.413988 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b48ff9644-mb62f" podUID="0d3a61c6-2a73-409f-b296-10f7a19685d6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Oct 14 13:18:26 crc kubenswrapper[4837]: I1014 13:18:26.351796 4837 generic.go:334] "Generic (PLEG): container finished" podID="2646558f-772d-41e3-8079-ae80e140a23a" containerID="7df6eeafaba5dada356f67f2737d60da0144aeedf8d39cde437bac7240312815" exitCode=0 Oct 14 13:18:26 crc kubenswrapper[4837]: I1014 13:18:26.352912 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7h69" event={"ID":"2646558f-772d-41e3-8079-ae80e140a23a","Type":"ContainerDied","Data":"7df6eeafaba5dada356f67f2737d60da0144aeedf8d39cde437bac7240312815"} Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.338012 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.389432 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.389431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77623217-820b-4085-890a-d8afa93925f6","Type":"ContainerDied","Data":"ebe749d9360d2e756ccd0e5f8cbeeba94145b0c773697cd211fee2cf0984e5a4"} Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.389493 4837 scope.go:117] "RemoveContainer" containerID="1786bf6810e92d711d1cb9b12dbc7aecbbaf2c280f288f6443f32a3c411c1588" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.406067 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-public-tls-certs\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.406105 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-httpd-run\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.406121 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-combined-ca-bundle\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.406144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-logs\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: 
I1014 13:18:27.406193 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-config-data\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.406412 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-scripts\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.406450 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzvrq\" (UniqueName: \"kubernetes.io/projected/77623217-820b-4085-890a-d8afa93925f6-kube-api-access-mzvrq\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.408635 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"77623217-820b-4085-890a-d8afa93925f6\" (UID: \"77623217-820b-4085-890a-d8afa93925f6\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.410704 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.412235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-logs" (OuterVolumeSpecName: "logs") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.412741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77623217-820b-4085-890a-d8afa93925f6-kube-api-access-mzvrq" (OuterVolumeSpecName: "kube-api-access-mzvrq") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "kube-api-access-mzvrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.416329 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.420257 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzvrq\" (UniqueName: \"kubernetes.io/projected/77623217-820b-4085-890a-d8afa93925f6-kube-api-access-mzvrq\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.420313 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.420338 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.420353 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77623217-820b-4085-890a-d8afa93925f6-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.437339 4837 scope.go:117] "RemoveContainer" containerID="bd05c40b9c92cf4821f125bb79d074bf17a6db4bfaa9b1a0fccee83a79fc8432" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.457456 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-scripts" (OuterVolumeSpecName: "scripts") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.499481 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.509637 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.517764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-config-data" (OuterVolumeSpecName: "config-data") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.521878 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.521906 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.521923 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.521936 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.538662 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "77623217-820b-4085-890a-d8afa93925f6" (UID: "77623217-820b-4085-890a-d8afa93925f6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.623915 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77623217-820b-4085-890a-d8afa93925f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.661336 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.729864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-config-data\") pod \"2646558f-772d-41e3-8079-ae80e140a23a\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.729916 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-credential-keys\") pod \"2646558f-772d-41e3-8079-ae80e140a23a\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.729987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-combined-ca-bundle\") pod \"2646558f-772d-41e3-8079-ae80e140a23a\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.730007 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-scripts\") pod \"2646558f-772d-41e3-8079-ae80e140a23a\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.730072 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-fernet-keys\") pod \"2646558f-772d-41e3-8079-ae80e140a23a\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.730090 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shlx9\" (UniqueName: 
\"kubernetes.io/projected/2646558f-772d-41e3-8079-ae80e140a23a-kube-api-access-shlx9\") pod \"2646558f-772d-41e3-8079-ae80e140a23a\" (UID: \"2646558f-772d-41e3-8079-ae80e140a23a\") " Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.740386 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2646558f-772d-41e3-8079-ae80e140a23a-kube-api-access-shlx9" (OuterVolumeSpecName: "kube-api-access-shlx9") pod "2646558f-772d-41e3-8079-ae80e140a23a" (UID: "2646558f-772d-41e3-8079-ae80e140a23a"). InnerVolumeSpecName "kube-api-access-shlx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.740450 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.755019 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.766602 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:18:27 crc kubenswrapper[4837]: E1014 13:18:27.767034 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-log" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767054 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-log" Oct 14 13:18:27 crc kubenswrapper[4837]: E1014 13:18:27.767070 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="init" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767078 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="init" Oct 14 13:18:27 crc kubenswrapper[4837]: E1014 13:18:27.767098 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2646558f-772d-41e3-8079-ae80e140a23a" containerName="keystone-bootstrap" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767105 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2646558f-772d-41e3-8079-ae80e140a23a" containerName="keystone-bootstrap" Oct 14 13:18:27 crc kubenswrapper[4837]: E1014 13:18:27.767118 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-httpd" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767125 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-httpd" Oct 14 13:18:27 crc kubenswrapper[4837]: E1014 13:18:27.767143 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767150 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767372 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-httpd" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767396 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="77623217-820b-4085-890a-d8afa93925f6" containerName="glance-log" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767405 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2188251b-90b0-46fe-9bb2-5b16cd3d1dac" containerName="dnsmasq-dns" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.767420 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2646558f-772d-41e3-8079-ae80e140a23a" containerName="keystone-bootstrap" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.768443 4837 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.773852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2646558f-772d-41e3-8079-ae80e140a23a" (UID: "2646558f-772d-41e3-8079-ae80e140a23a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.775186 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.776486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-scripts" (OuterVolumeSpecName: "scripts") pod "2646558f-772d-41e3-8079-ae80e140a23a" (UID: "2646558f-772d-41e3-8079-ae80e140a23a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.777814 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.780008 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.781197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2646558f-772d-41e3-8079-ae80e140a23a" (UID: "2646558f-772d-41e3-8079-ae80e140a23a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.781789 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2646558f-772d-41e3-8079-ae80e140a23a" (UID: "2646558f-772d-41e3-8079-ae80e140a23a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.810189 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-config-data" (OuterVolumeSpecName: "config-data") pod "2646558f-772d-41e3-8079-ae80e140a23a" (UID: "2646558f-772d-41e3-8079-ae80e140a23a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832431 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832533 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832678 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832702 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832757 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvrhs\" (UniqueName: \"kubernetes.io/projected/c72488b0-037a-4284-b428-e1907b3aa9ae-kube-api-access-qvrhs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832799 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-logs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832844 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832919 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832964 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832977 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.832988 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.833000 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2646558f-772d-41e3-8079-ae80e140a23a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.833034 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shlx9\" (UniqueName: \"kubernetes.io/projected/2646558f-772d-41e3-8079-ae80e140a23a-kube-api-access-shlx9\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-logs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934245 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934309 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934329 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934359 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934380 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvrhs\" (UniqueName: \"kubernetes.io/projected/c72488b0-037a-4284-b428-e1907b3aa9ae-kube-api-access-qvrhs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934494 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-logs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.934684 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.939127 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.939805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.943720 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.944460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.952021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvrhs\" (UniqueName: \"kubernetes.io/projected/c72488b0-037a-4284-b428-e1907b3aa9ae-kube-api-access-qvrhs\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" 
Oct 14 13:18:27 crc kubenswrapper[4837]: I1014 13:18:27.959564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " pod="openstack/glance-default-external-api-0" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.092645 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.416686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7h69" event={"ID":"2646558f-772d-41e3-8079-ae80e140a23a","Type":"ContainerDied","Data":"6f6fea158fb06ef8cc6a59fc85de852bd8e008d3f2b84dfaa0a0292ab6e02b05"} Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.416904 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f6fea158fb06ef8cc6a59fc85de852bd8e008d3f2b84dfaa0a0292ab6e02b05" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.416864 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7h69" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.425615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b69bq" event={"ID":"6d33f0a5-b130-4614-9636-fa0d61fa4e11","Type":"ContainerStarted","Data":"4d186e9c62731590c1feb726579528333c986b72bce2aa320d81395f9be8ff4c"} Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.442046 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d55e59-7431-474a-a2eb-be646017f3c2","Type":"ContainerStarted","Data":"c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177"} Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.446729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5j7sn" event={"ID":"14d2b3ef-eed6-48cb-948b-3618d6f53fff","Type":"ContainerStarted","Data":"c63042a8cb32ba40653be97d76b7967d87b91a763bd979e762f1d82d75694dd3"} Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.459752 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-b69bq" podStartSLOduration=4.154507143 podStartE2EDuration="1m24.459734826s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="2025-10-14 13:17:06.802442039 +0000 UTC m=+964.719441852" lastFinishedPulling="2025-10-14 13:18:27.107669722 +0000 UTC m=+1045.024669535" observedRunningTime="2025-10-14 13:18:28.457794415 +0000 UTC m=+1046.374794228" watchObservedRunningTime="2025-10-14 13:18:28.459734826 +0000 UTC m=+1046.376734639" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.489610 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bcd589b8f-ljfsq"] Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.490570 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.493682 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.493733 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.493703 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6z55" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.493797 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.493945 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.494021 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.503519 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bcd589b8f-ljfsq"] Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.530983 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-5j7sn" podStartSLOduration=3.389328666 podStartE2EDuration="1m24.530960173s" podCreationTimestamp="2025-10-14 13:17:04 +0000 UTC" firstStartedPulling="2025-10-14 13:17:05.970478453 +0000 UTC m=+963.887478266" lastFinishedPulling="2025-10-14 13:18:27.11210996 +0000 UTC m=+1045.029109773" observedRunningTime="2025-10-14 13:18:28.511850952 +0000 UTC m=+1046.428850775" watchObservedRunningTime="2025-10-14 13:18:28.530960173 +0000 UTC m=+1046.447959986" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-config-data\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-combined-ca-bundle\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-credential-keys\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-fernet-keys\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545738 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-scripts\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545955 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7wcnv\" (UniqueName: \"kubernetes.io/projected/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-kube-api-access-7wcnv\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.545982 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-public-tls-certs\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.546021 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-internal-tls-certs\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.644993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.647837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-config-data\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.647907 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-combined-ca-bundle\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 
13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.647935 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-credential-keys\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.647976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-fernet-keys\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.647999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-scripts\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.648023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wcnv\" (UniqueName: \"kubernetes.io/projected/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-kube-api-access-7wcnv\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.648045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-public-tls-certs\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.648077 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-internal-tls-certs\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.654079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-fernet-keys\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.655234 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-internal-tls-certs\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.656086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-scripts\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.657120 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-config-data\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.658376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-credential-keys\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.659501 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-combined-ca-bundle\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: W1014 13:18:28.662543 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc72488b0_037a_4284_b428_e1907b3aa9ae.slice/crio-528b724c93f160c0d15cf8173200a96563091d365af2d45cba29616e47f6b4b4 WatchSource:0}: Error finding container 528b724c93f160c0d15cf8173200a96563091d365af2d45cba29616e47f6b4b4: Status 404 returned error can't find the container with id 528b724c93f160c0d15cf8173200a96563091d365af2d45cba29616e47f6b4b4 Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.662650 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-public-tls-certs\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.677327 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wcnv\" (UniqueName: \"kubernetes.io/projected/9dd1fb1b-4520-43c9-8a24-fd0a225856a3-kube-api-access-7wcnv\") pod \"keystone-7bcd589b8f-ljfsq\" (UID: \"9dd1fb1b-4520-43c9-8a24-fd0a225856a3\") " pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.796988 4837 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="77623217-820b-4085-890a-d8afa93925f6" path="/var/lib/kubelet/pods/77623217-820b-4085-890a-d8afa93925f6/volumes" Oct 14 13:18:28 crc kubenswrapper[4837]: I1014 13:18:28.822761 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:29 crc kubenswrapper[4837]: I1014 13:18:29.338683 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bcd589b8f-ljfsq"] Oct 14 13:18:29 crc kubenswrapper[4837]: W1014 13:18:29.357416 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dd1fb1b_4520_43c9_8a24_fd0a225856a3.slice/crio-f659a67251d26870c80a6bd4623c1249fbf21b09cc0599933e4f8d15f3a79431 WatchSource:0}: Error finding container f659a67251d26870c80a6bd4623c1249fbf21b09cc0599933e4f8d15f3a79431: Status 404 returned error can't find the container with id f659a67251d26870c80a6bd4623c1249fbf21b09cc0599933e4f8d15f3a79431 Oct 14 13:18:29 crc kubenswrapper[4837]: I1014 13:18:29.462128 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c72488b0-037a-4284-b428-e1907b3aa9ae","Type":"ContainerStarted","Data":"91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6"} Oct 14 13:18:29 crc kubenswrapper[4837]: I1014 13:18:29.462200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c72488b0-037a-4284-b428-e1907b3aa9ae","Type":"ContainerStarted","Data":"528b724c93f160c0d15cf8173200a96563091d365af2d45cba29616e47f6b4b4"} Oct 14 13:18:29 crc kubenswrapper[4837]: I1014 13:18:29.467173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bcd589b8f-ljfsq" event={"ID":"9dd1fb1b-4520-43c9-8a24-fd0a225856a3","Type":"ContainerStarted","Data":"f659a67251d26870c80a6bd4623c1249fbf21b09cc0599933e4f8d15f3a79431"} Oct 14 13:18:30 crc 
kubenswrapper[4837]: I1014 13:18:30.477791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c72488b0-037a-4284-b428-e1907b3aa9ae","Type":"ContainerStarted","Data":"459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278"} Oct 14 13:18:30 crc kubenswrapper[4837]: I1014 13:18:30.480515 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bcd589b8f-ljfsq" event={"ID":"9dd1fb1b-4520-43c9-8a24-fd0a225856a3","Type":"ContainerStarted","Data":"8c3a9a31cc2a9bdd4ddf9030c4ce78c67bda730602ba9b17f1bb40dcc399967c"} Oct 14 13:18:30 crc kubenswrapper[4837]: I1014 13:18:30.480648 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:18:30 crc kubenswrapper[4837]: I1014 13:18:30.497014 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.496996331 podStartE2EDuration="3.496996331s" podCreationTimestamp="2025-10-14 13:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:30.493840415 +0000 UTC m=+1048.410840238" watchObservedRunningTime="2025-10-14 13:18:30.496996331 +0000 UTC m=+1048.413996144" Oct 14 13:18:30 crc kubenswrapper[4837]: I1014 13:18:30.516727 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bcd589b8f-ljfsq" podStartSLOduration=2.516712628 podStartE2EDuration="2.516712628s" podCreationTimestamp="2025-10-14 13:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:30.516638456 +0000 UTC m=+1048.433638279" watchObservedRunningTime="2025-10-14 13:18:30.516712628 +0000 UTC m=+1048.433712441" Oct 14 13:18:36 crc kubenswrapper[4837]: I1014 13:18:36.047603 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:18:37 crc kubenswrapper[4837]: I1014 13:18:37.247888 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:18:37 crc kubenswrapper[4837]: I1014 13:18:37.833698 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:18:38 crc kubenswrapper[4837]: I1014 13:18:38.093640 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 14 13:18:38 crc kubenswrapper[4837]: I1014 13:18:38.095183 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 14 13:18:38 crc kubenswrapper[4837]: I1014 13:18:38.121704 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 14 13:18:38 crc kubenswrapper[4837]: I1014 13:18:38.141567 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 14 13:18:38 crc kubenswrapper[4837]: I1014 13:18:38.554558 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 14 13:18:38 crc kubenswrapper[4837]: I1014 13:18:38.554790 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 14 13:18:39 crc kubenswrapper[4837]: I1014 13:18:39.008109 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b48ff9644-mb62f" Oct 14 13:18:39 crc kubenswrapper[4837]: I1014 13:18:39.070345 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6655497d8d-h2r8r"] Oct 14 13:18:39 crc kubenswrapper[4837]: I1014 13:18:39.070615 4837 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-6655497d8d-h2r8r" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon-log" containerID="cri-o://b49e994a7536096b426fc73f5d5665d1e524c08924c7a054563890c3689310a7" gracePeriod=30 Oct 14 13:18:39 crc kubenswrapper[4837]: I1014 13:18:39.070763 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6655497d8d-h2r8r" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" containerID="cri-o://ce844888499b913841277f9d1a2c07dd86967378d2867cfebad77ec15be8e2c0" gracePeriod=30 Oct 14 13:18:40 crc kubenswrapper[4837]: I1014 13:18:40.519050 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 14 13:18:40 crc kubenswrapper[4837]: I1014 13:18:40.573185 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:18:40 crc kubenswrapper[4837]: I1014 13:18:40.911502 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 14 13:18:41 crc kubenswrapper[4837]: I1014 13:18:41.139875 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:18:41 crc kubenswrapper[4837]: I1014 13:18:41.139929 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:18:41 crc kubenswrapper[4837]: E1014 13:18:41.913116 4837 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 14 13:18:41 crc kubenswrapper[4837]: E1014 13:18:41.913345 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wc76j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(58d55e59-7431-474a-a2eb-be646017f3c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 14 13:18:41 crc kubenswrapper[4837]: E1014 13:18:41.914606 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage2398971497/1\\\": happened during read: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled\"]" pod="openstack/ceilometer-0" podUID="58d55e59-7431-474a-a2eb-be646017f3c2" Oct 14 13:18:42 crc kubenswrapper[4837]: I1014 13:18:42.601476 4837 generic.go:334] "Generic (PLEG): container finished" podID="d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" containerID="38e3d1774dbf6264443841868486acadd41cbf5d1855735693fcfd529af7260e" exitCode=0 Oct 14 13:18:42 crc kubenswrapper[4837]: I1014 13:18:42.602011 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jddnh" event={"ID":"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd","Type":"ContainerDied","Data":"38e3d1774dbf6264443841868486acadd41cbf5d1855735693fcfd529af7260e"} Oct 14 13:18:42 crc kubenswrapper[4837]: I1014 13:18:42.608143 4837 generic.go:334] "Generic (PLEG): container finished" podID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerID="ce844888499b913841277f9d1a2c07dd86967378d2867cfebad77ec15be8e2c0" exitCode=0 Oct 14 13:18:42 crc kubenswrapper[4837]: I1014 13:18:42.608228 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6655497d8d-h2r8r" event={"ID":"2372174a-82ac-4421-a4e7-8ffcd7b4e92f","Type":"ContainerDied","Data":"ce844888499b913841277f9d1a2c07dd86967378d2867cfebad77ec15be8e2c0"} Oct 14 13:18:42 crc kubenswrapper[4837]: I1014 13:18:42.608455 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d55e59-7431-474a-a2eb-be646017f3c2" containerName="sg-core" containerID="cri-o://c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177" gracePeriod=30 Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.037263 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.119718 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-config-data\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.119805 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-scripts\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.119852 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-log-httpd\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.119903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-sg-core-conf-yaml\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.119974 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-run-httpd\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.120036 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc76j\" (UniqueName: 
\"kubernetes.io/projected/58d55e59-7431-474a-a2eb-be646017f3c2-kube-api-access-wc76j\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.120054 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-combined-ca-bundle\") pod \"58d55e59-7431-474a-a2eb-be646017f3c2\" (UID: \"58d55e59-7431-474a-a2eb-be646017f3c2\") " Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.120548 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.120757 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.126411 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-scripts" (OuterVolumeSpecName: "scripts") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.126899 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d55e59-7431-474a-a2eb-be646017f3c2-kube-api-access-wc76j" (OuterVolumeSpecName: "kube-api-access-wc76j") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "kube-api-access-wc76j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.127220 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-config-data" (OuterVolumeSpecName: "config-data") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.132305 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.150195 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "58d55e59-7431-474a-a2eb-be646017f3c2" (UID: "58d55e59-7431-474a-a2eb-be646017f3c2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221341 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221576 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc76j\" (UniqueName: \"kubernetes.io/projected/58d55e59-7431-474a-a2eb-be646017f3c2-kube-api-access-wc76j\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221586 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221594 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221602 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221609 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d55e59-7431-474a-a2eb-be646017f3c2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.221617 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d55e59-7431-474a-a2eb-be646017f3c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.621217 4837 generic.go:334] "Generic 
(PLEG): container finished" podID="58d55e59-7431-474a-a2eb-be646017f3c2" containerID="c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177" exitCode=2 Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.621346 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d55e59-7431-474a-a2eb-be646017f3c2","Type":"ContainerDied","Data":"c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177"} Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.621419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d55e59-7431-474a-a2eb-be646017f3c2","Type":"ContainerDied","Data":"cd93f4de982a838533f170059bfcde8037702b62f379542706a4d58500e7725b"} Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.621452 4837 scope.go:117] "RemoveContainer" containerID="c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.621624 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.682135 4837 scope.go:117] "RemoveContainer" containerID="c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177" Oct 14 13:18:43 crc kubenswrapper[4837]: E1014 13:18:43.682845 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177\": container with ID starting with c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177 not found: ID does not exist" containerID="c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.682986 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177"} err="failed to get container status \"c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177\": rpc error: code = NotFound desc = could not find container \"c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177\": container with ID starting with c3e7dec43cceee30b2700d8a1a2bb88712092bb4af76c2750dc484d9b2265177 not found: ID does not exist" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.734840 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.743865 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.753293 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:18:43 crc kubenswrapper[4837]: E1014 13:18:43.753664 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d55e59-7431-474a-a2eb-be646017f3c2" containerName="sg-core" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.753687 4837 
state_mem.go:107] "Deleted CPUSet assignment" podUID="58d55e59-7431-474a-a2eb-be646017f3c2" containerName="sg-core" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.754031 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d55e59-7431-474a-a2eb-be646017f3c2" containerName="sg-core" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.757016 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.759133 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.760451 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.761878 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67mg\" (UniqueName: \"kubernetes.io/projected/148e6967-e15d-4c5c-89db-5a029e0ce45b-kube-api-access-p67mg\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-run-httpd\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834630 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-config-data\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834841 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-scripts\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.834867 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-log-httpd\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936493 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-config-data\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936584 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936620 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-scripts\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936646 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-log-httpd\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p67mg\" (UniqueName: \"kubernetes.io/projected/148e6967-e15d-4c5c-89db-5a029e0ce45b-kube-api-access-p67mg\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-run-httpd\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.936776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.937306 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-log-httpd\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.937576 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-run-httpd\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.942077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.942078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-scripts\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.942352 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.943631 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-config-data\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:43 crc kubenswrapper[4837]: I1014 13:18:43.953568 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67mg\" (UniqueName: \"kubernetes.io/projected/148e6967-e15d-4c5c-89db-5a029e0ce45b-kube-api-access-p67mg\") pod \"ceilometer-0\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " pod="openstack/ceilometer-0" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.031567 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jddnh" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.089869 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.140937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-scripts\") pod \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.141047 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-logs\") pod \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.141106 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-config-data\") pod \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.141200 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-combined-ca-bundle\") pod \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.141244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whw4d\" (UniqueName: \"kubernetes.io/projected/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-kube-api-access-whw4d\") pod \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\" (UID: \"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd\") " Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.142238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-logs" (OuterVolumeSpecName: "logs") pod "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" (UID: "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.144766 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-kube-api-access-whw4d" (OuterVolumeSpecName: "kube-api-access-whw4d") pod "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" (UID: "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd"). InnerVolumeSpecName "kube-api-access-whw4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.145959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-scripts" (OuterVolumeSpecName: "scripts") pod "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" (UID: "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.170847 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-config-data" (OuterVolumeSpecName: "config-data") pod "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" (UID: "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.176935 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" (UID: "d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.243360 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.243391 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.243402 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.243413 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whw4d\" (UniqueName: \"kubernetes.io/projected/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-kube-api-access-whw4d\") on node \"crc\" DevicePath \"\"" Oct 14 
13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.243422 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.251214 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6655497d8d-h2r8r" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 14 13:18:44 crc kubenswrapper[4837]: W1014 13:18:44.510094 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148e6967_e15d_4c5c_89db_5a029e0ce45b.slice/crio-6d525a0e74ce462b8965d81f151beb4df1d6fd0e26672c765206914758e4517a WatchSource:0}: Error finding container 6d525a0e74ce462b8965d81f151beb4df1d6fd0e26672c765206914758e4517a: Status 404 returned error can't find the container with id 6d525a0e74ce462b8965d81f151beb4df1d6fd0e26672c765206914758e4517a Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.513689 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.634747 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jddnh" event={"ID":"d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd","Type":"ContainerDied","Data":"fe0c2f37085b5968c6d176c2d4391ce8b3120e8cfb49f49fad6f16896876023b"} Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.634790 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0c2f37085b5968c6d176c2d4391ce8b3120e8cfb49f49fad6f16896876023b" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.634918 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jddnh" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.637900 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerStarted","Data":"6d525a0e74ce462b8965d81f151beb4df1d6fd0e26672c765206914758e4517a"} Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.749432 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67c99f9644-lpk76"] Oct 14 13:18:44 crc kubenswrapper[4837]: E1014 13:18:44.749989 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" containerName="placement-db-sync" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.750028 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" containerName="placement-db-sync" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.750346 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" containerName="placement-db-sync" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.751535 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.754515 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.754747 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.754945 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.755120 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.755652 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k4rzv" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.771712 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67c99f9644-lpk76"] Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.803728 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d55e59-7431-474a-a2eb-be646017f3c2" path="/var/lib/kubelet/pods/58d55e59-7431-474a-a2eb-be646017f3c2/volumes" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.853481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-scripts\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.853555 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncv8x\" (UniqueName: 
\"kubernetes.io/projected/3be94ea9-34d4-4765-92cf-93345cfb88bb-kube-api-access-ncv8x\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.853614 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-public-tls-certs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.853763 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-combined-ca-bundle\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.854000 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be94ea9-34d4-4765-92cf-93345cfb88bb-logs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.854052 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-config-data\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.854121 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-internal-tls-certs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.955766 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-internal-tls-certs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.955825 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-scripts\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.955858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncv8x\" (UniqueName: \"kubernetes.io/projected/3be94ea9-34d4-4765-92cf-93345cfb88bb-kube-api-access-ncv8x\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.955910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-public-tls-certs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.955957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-combined-ca-bundle\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.955989 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be94ea9-34d4-4765-92cf-93345cfb88bb-logs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.956011 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-config-data\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.957894 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be94ea9-34d4-4765-92cf-93345cfb88bb-logs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.961974 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-config-data\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.961979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-internal-tls-certs\") pod \"placement-67c99f9644-lpk76\" (UID: 
\"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.963130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-scripts\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.965248 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-combined-ca-bundle\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.973787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be94ea9-34d4-4765-92cf-93345cfb88bb-public-tls-certs\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:44 crc kubenswrapper[4837]: I1014 13:18:44.977644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncv8x\" (UniqueName: \"kubernetes.io/projected/3be94ea9-34d4-4765-92cf-93345cfb88bb-kube-api-access-ncv8x\") pod \"placement-67c99f9644-lpk76\" (UID: \"3be94ea9-34d4-4765-92cf-93345cfb88bb\") " pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:45 crc kubenswrapper[4837]: I1014 13:18:45.089572 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:45 crc kubenswrapper[4837]: I1014 13:18:45.568555 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67c99f9644-lpk76"] Oct 14 13:18:45 crc kubenswrapper[4837]: W1014 13:18:45.577636 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3be94ea9_34d4_4765_92cf_93345cfb88bb.slice/crio-e2cc482b9c309de681e68e7e22616751deb7e17051c38a909f723273e726009e WatchSource:0}: Error finding container e2cc482b9c309de681e68e7e22616751deb7e17051c38a909f723273e726009e: Status 404 returned error can't find the container with id e2cc482b9c309de681e68e7e22616751deb7e17051c38a909f723273e726009e Oct 14 13:18:45 crc kubenswrapper[4837]: I1014 13:18:45.648602 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c99f9644-lpk76" event={"ID":"3be94ea9-34d4-4765-92cf-93345cfb88bb","Type":"ContainerStarted","Data":"e2cc482b9c309de681e68e7e22616751deb7e17051c38a909f723273e726009e"} Oct 14 13:18:45 crc kubenswrapper[4837]: I1014 13:18:45.651970 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" containerID="4d186e9c62731590c1feb726579528333c986b72bce2aa320d81395f9be8ff4c" exitCode=0 Oct 14 13:18:45 crc kubenswrapper[4837]: I1014 13:18:45.652044 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b69bq" event={"ID":"6d33f0a5-b130-4614-9636-fa0d61fa4e11","Type":"ContainerDied","Data":"4d186e9c62731590c1feb726579528333c986b72bce2aa320d81395f9be8ff4c"} Oct 14 13:18:45 crc kubenswrapper[4837]: I1014 13:18:45.654921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerStarted","Data":"0bde1f81d042890b7a3b5b737d24072929d61912234fda517a5f54771adaba5f"} Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 
13:18:46.665460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c99f9644-lpk76" event={"ID":"3be94ea9-34d4-4765-92cf-93345cfb88bb","Type":"ContainerStarted","Data":"6220221e8753f4511e56f2486d8e873da9e65a60d87a7a77e00a1fe7661dad45"} Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.665898 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.665913 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.665922 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c99f9644-lpk76" event={"ID":"3be94ea9-34d4-4765-92cf-93345cfb88bb","Type":"ContainerStarted","Data":"c6c9d13ceecea8b60e732fcd67404cdcc3bbf8f7057ce696f51a8c845cf47200"} Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.668730 4837 generic.go:334] "Generic (PLEG): container finished" podID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" containerID="c63042a8cb32ba40653be97d76b7967d87b91a763bd979e762f1d82d75694dd3" exitCode=0 Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.668827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5j7sn" event={"ID":"14d2b3ef-eed6-48cb-948b-3618d6f53fff","Type":"ContainerDied","Data":"c63042a8cb32ba40653be97d76b7967d87b91a763bd979e762f1d82d75694dd3"} Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.703886 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67c99f9644-lpk76" podStartSLOduration=2.703864486 podStartE2EDuration="2.703864486s" podCreationTimestamp="2025-10-14 13:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:46.694537727 +0000 UTC m=+1064.611537530" 
watchObservedRunningTime="2025-10-14 13:18:46.703864486 +0000 UTC m=+1064.620864309" Oct 14 13:18:46 crc kubenswrapper[4837]: I1014 13:18:46.985100 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b69bq" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.110801 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-combined-ca-bundle\") pod \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.111190 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-db-sync-config-data\") pod \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.111370 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kz99\" (UniqueName: \"kubernetes.io/projected/6d33f0a5-b130-4614-9636-fa0d61fa4e11-kube-api-access-6kz99\") pod \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\" (UID: \"6d33f0a5-b130-4614-9636-fa0d61fa4e11\") " Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.116266 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d33f0a5-b130-4614-9636-fa0d61fa4e11" (UID: "6d33f0a5-b130-4614-9636-fa0d61fa4e11"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.116395 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d33f0a5-b130-4614-9636-fa0d61fa4e11-kube-api-access-6kz99" (OuterVolumeSpecName: "kube-api-access-6kz99") pod "6d33f0a5-b130-4614-9636-fa0d61fa4e11" (UID: "6d33f0a5-b130-4614-9636-fa0d61fa4e11"). InnerVolumeSpecName "kube-api-access-6kz99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.138233 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d33f0a5-b130-4614-9636-fa0d61fa4e11" (UID: "6d33f0a5-b130-4614-9636-fa0d61fa4e11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.213740 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kz99\" (UniqueName: \"kubernetes.io/projected/6d33f0a5-b130-4614-9636-fa0d61fa4e11-kube-api-access-6kz99\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.213772 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.213783 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d33f0a5-b130-4614-9636-fa0d61fa4e11-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.679803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b69bq" 
event={"ID":"6d33f0a5-b130-4614-9636-fa0d61fa4e11","Type":"ContainerDied","Data":"bdefe6a3f10b04c1e9bfe9e6260b260001fd7da4fe00d421ef8233c1b02f02a7"} Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.679842 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdefe6a3f10b04c1e9bfe9e6260b260001fd7da4fe00d421ef8233c1b02f02a7" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.679908 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b69bq" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.686175 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerStarted","Data":"2593160a9c4c902c162deaa393413d006bee9d82c3b266a59a75d4e584cfa929"} Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.952371 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.964224 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-64dbf95879-s4jqv"] Oct 14 13:18:47 crc kubenswrapper[4837]: E1014 13:18:47.964655 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" containerName="cinder-db-sync" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.964670 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" containerName="cinder-db-sync" Oct 14 13:18:47 crc kubenswrapper[4837]: E1014 13:18:47.964707 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" containerName="barbican-db-sync" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.964716 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" containerName="barbican-db-sync" Oct 14 
13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.964932 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" containerName="barbican-db-sync" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.964957 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" containerName="cinder-db-sync" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.966058 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.976622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.976871 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.976903 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4s92n" Oct 14 13:18:47 crc kubenswrapper[4837]: I1014 13:18:47.983652 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-64dbf95879-s4jqv"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.006412 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5dd7b6957d-hqts4"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.007852 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.011598 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.064229 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vcc\" (UniqueName: \"kubernetes.io/projected/14d2b3ef-eed6-48cb-948b-3618d6f53fff-kube-api-access-x7vcc\") pod \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.064371 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-scripts\") pod \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.064407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-db-sync-config-data\") pod \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.064472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14d2b3ef-eed6-48cb-948b-3618d6f53fff-etc-machine-id\") pod \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.064518 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-combined-ca-bundle\") pod \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\" 
(UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.064560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-config-data\") pod \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\" (UID: \"14d2b3ef-eed6-48cb-948b-3618d6f53fff\") " Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.065605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42a5890-7561-4b99-9518-0c6c672217d9-logs\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.065673 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-combined-ca-bundle\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.065717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-config-data\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.065793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52sg\" (UniqueName: \"kubernetes.io/projected/d42a5890-7561-4b99-9518-0c6c672217d9-kube-api-access-d52sg\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " 
pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.065838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-config-data-custom\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.070904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14d2b3ef-eed6-48cb-948b-3618d6f53fff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "14d2b3ef-eed6-48cb-948b-3618d6f53fff" (UID: "14d2b3ef-eed6-48cb-948b-3618d6f53fff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.075133 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-scripts" (OuterVolumeSpecName: "scripts") pod "14d2b3ef-eed6-48cb-948b-3618d6f53fff" (UID: "14d2b3ef-eed6-48cb-948b-3618d6f53fff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.091188 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14d2b3ef-eed6-48cb-948b-3618d6f53fff" (UID: "14d2b3ef-eed6-48cb-948b-3618d6f53fff"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.094806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d2b3ef-eed6-48cb-948b-3618d6f53fff-kube-api-access-x7vcc" (OuterVolumeSpecName: "kube-api-access-x7vcc") pod "14d2b3ef-eed6-48cb-948b-3618d6f53fff" (UID: "14d2b3ef-eed6-48cb-948b-3618d6f53fff"). InnerVolumeSpecName "kube-api-access-x7vcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.095567 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5dd7b6957d-hqts4"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.130582 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-tpfp2"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.132082 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.173155 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-config-data\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.173573 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42a5890-7561-4b99-9518-0c6c672217d9-logs\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.173925 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-config-data-custom\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.173985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-combined-ca-bundle\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-config-data\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174040 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52b1001-3fb0-415b-be6a-e55a548462ac-logs\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52sg\" (UniqueName: \"kubernetes.io/projected/d42a5890-7561-4b99-9518-0c6c672217d9-kube-api-access-d52sg\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174109 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-combined-ca-bundle\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2b5\" (UniqueName: \"kubernetes.io/projected/e52b1001-3fb0-415b-be6a-e55a548462ac-kube-api-access-cm2b5\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174288 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d42a5890-7561-4b99-9518-0c6c672217d9-logs\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.174816 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-config-data-custom\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.176483 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vcc\" (UniqueName: \"kubernetes.io/projected/14d2b3ef-eed6-48cb-948b-3618d6f53fff-kube-api-access-x7vcc\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.176527 4837 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.176538 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.176548 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14d2b3ef-eed6-48cb-948b-3618d6f53fff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.182610 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-config-data-custom\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.182743 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-config-data" (OuterVolumeSpecName: "config-data") pod "14d2b3ef-eed6-48cb-948b-3618d6f53fff" (UID: "14d2b3ef-eed6-48cb-948b-3618d6f53fff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.190079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-combined-ca-bundle\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.190166 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-tpfp2"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.191683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d42a5890-7561-4b99-9518-0c6c672217d9-config-data\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.198282 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14d2b3ef-eed6-48cb-948b-3618d6f53fff" (UID: "14d2b3ef-eed6-48cb-948b-3618d6f53fff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.200601 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52sg\" (UniqueName: \"kubernetes.io/projected/d42a5890-7561-4b99-9518-0c6c672217d9-kube-api-access-d52sg\") pod \"barbican-worker-64dbf95879-s4jqv\" (UID: \"d42a5890-7561-4b99-9518-0c6c672217d9\") " pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.211748 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56b5874f78-zwqfr"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.213250 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.216610 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56b5874f78-zwqfr"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.216772 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.278573 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-config-data\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.278641 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-config\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 
13:18:48.279699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.279736 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.279753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.279784 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-config-data-custom\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.279864 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52b1001-3fb0-415b-be6a-e55a548462ac-logs\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 
14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.279928 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-combined-ca-bundle\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.279946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2b5\" (UniqueName: \"kubernetes.io/projected/e52b1001-3fb0-415b-be6a-e55a548462ac-kube-api-access-cm2b5\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.280001 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.280018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6cm7\" (UniqueName: \"kubernetes.io/projected/1cbdae00-eb75-411e-8f1f-9d6a05d64628-kube-api-access-d6cm7\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.280066 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:48 crc kubenswrapper[4837]: 
I1014 13:18:48.280081 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d2b3ef-eed6-48cb-948b-3618d6f53fff-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.280893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52b1001-3fb0-415b-be6a-e55a548462ac-logs\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.284279 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-combined-ca-bundle\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.286474 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-config-data\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.289051 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e52b1001-3fb0-415b-be6a-e55a548462ac-config-data-custom\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.302371 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cm2b5\" (UniqueName: \"kubernetes.io/projected/e52b1001-3fb0-415b-be6a-e55a548462ac-kube-api-access-cm2b5\") pod \"barbican-keystone-listener-5dd7b6957d-hqts4\" (UID: \"e52b1001-3fb0-415b-be6a-e55a548462ac\") " pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.305483 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64dbf95879-s4jqv" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.347198 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.381790 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-combined-ca-bundle\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.381911 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a827d62-4407-44f4-afc4-b21c06888c13-logs\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.381969 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.381997 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382019 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6cm7\" (UniqueName: \"kubernetes.io/projected/1cbdae00-eb75-411e-8f1f-9d6a05d64628-kube-api-access-d6cm7\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-config\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382075 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382091 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382117 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data-custom\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.382180 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxpv4\" (UniqueName: \"kubernetes.io/projected/2a827d62-4407-44f4-afc4-b21c06888c13-kube-api-access-wxpv4\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.383038 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.384325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-config\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.384936 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.385483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.386077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.415087 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6cm7\" (UniqueName: \"kubernetes.io/projected/1cbdae00-eb75-411e-8f1f-9d6a05d64628-kube-api-access-d6cm7\") pod \"dnsmasq-dns-6d66f584d7-tpfp2\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.468247 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.483276 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a827d62-4407-44f4-afc4-b21c06888c13-logs\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.483578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.483622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data-custom\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.483655 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxpv4\" (UniqueName: \"kubernetes.io/projected/2a827d62-4407-44f4-afc4-b21c06888c13-kube-api-access-wxpv4\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.483697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-combined-ca-bundle\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " 
pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.483841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a827d62-4407-44f4-afc4-b21c06888c13-logs\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.487783 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-combined-ca-bundle\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.489798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data-custom\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.490931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.506256 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxpv4\" (UniqueName: \"kubernetes.io/projected/2a827d62-4407-44f4-afc4-b21c06888c13-kube-api-access-wxpv4\") pod \"barbican-api-56b5874f78-zwqfr\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: 
I1014 13:18:48.533236 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.701501 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerStarted","Data":"c58ba7655a29419ccb05230c455b8c766bbb730cb600598ff0094a57dffba076"} Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.710310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5j7sn" event={"ID":"14d2b3ef-eed6-48cb-948b-3618d6f53fff","Type":"ContainerDied","Data":"ddcdcf894de7560cb758f53f36aa03f36836b12c6aa96a97453b8dd5b210cdc9"} Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.710354 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddcdcf894de7560cb758f53f36aa03f36836b12c6aa96a97453b8dd5b210cdc9" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.710438 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-5j7sn" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.817281 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-64dbf95879-s4jqv"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.934939 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.939428 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.941577 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.941729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-j45rh" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.941857 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.942786 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.946486 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:18:48 crc kubenswrapper[4837]: I1014 13:18:48.968255 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5dd7b6957d-hqts4"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.004306 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-tpfp2"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.016649 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fc0917b-b4a9-4941-9dc1-199cb056d851-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.016725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " 
pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.016751 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.016796 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.016854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8fl\" (UniqueName: \"kubernetes.io/projected/1fc0917b-b4a9-4941-9dc1-199cb056d851-kube-api-access-gg8fl\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.016901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-scripts\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.032563 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-tpfp2"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.062307 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-674b76c99f-chvs9"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.064012 4837 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.072094 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-674b76c99f-chvs9"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118427 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118476 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118510 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118562 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8fl\" (UniqueName: \"kubernetes.io/projected/1fc0917b-b4a9-4941-9dc1-199cb056d851-kube-api-access-gg8fl\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlprp\" (UniqueName: 
\"kubernetes.io/projected/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-kube-api-access-xlprp\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-scripts\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118644 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-config\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118678 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-svc\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-nb\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118751 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-swift-storage-0\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fc0917b-b4a9-4941-9dc1-199cb056d851-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-sb\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.118903 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fc0917b-b4a9-4941-9dc1-199cb056d851-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.123633 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-scripts\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.124700 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.126461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.127377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.140865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8fl\" (UniqueName: \"kubernetes.io/projected/1fc0917b-b4a9-4941-9dc1-199cb056d851-kube-api-access-gg8fl\") pod \"cinder-scheduler-0\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.160570 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56b5874f78-zwqfr"] Oct 14 13:18:49 crc kubenswrapper[4837]: W1014 13:18:49.170690 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a827d62_4407_44f4_afc4_b21c06888c13.slice/crio-037a12ddd13f19e5e315c4fbd00959719382628576b53581bbf5564c8e7011c0 WatchSource:0}: Error finding container 037a12ddd13f19e5e315c4fbd00959719382628576b53581bbf5564c8e7011c0: Status 404 returned error can't find the container with id 037a12ddd13f19e5e315c4fbd00959719382628576b53581bbf5564c8e7011c0 Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.220984 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-config\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.222059 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-config\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.222151 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-svc\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.222194 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-nb\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.222224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-swift-storage-0\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.222249 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-sb\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.222902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-sb\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.223127 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-nb\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.223507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-svc\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.223770 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlprp\" (UniqueName: \"kubernetes.io/projected/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-kube-api-access-xlprp\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.224824 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-swift-storage-0\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.250342 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.252226 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.253449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlprp\" (UniqueName: \"kubernetes.io/projected/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-kube-api-access-xlprp\") pod \"dnsmasq-dns-674b76c99f-chvs9\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.272610 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.313811 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.317108 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.324963 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data-custom\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.325026 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c260216b-1400-490d-89f1-69b3ff76e223-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.325049 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c260216b-1400-490d-89f1-69b3ff76e223-logs\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.325067 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.325084 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h8qn\" (UniqueName: \"kubernetes.io/projected/c260216b-1400-490d-89f1-69b3ff76e223-kube-api-access-6h8qn\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.325105 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.325136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-scripts\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.387820 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.428019 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c260216b-1400-490d-89f1-69b3ff76e223-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.428550 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c260216b-1400-490d-89f1-69b3ff76e223-logs\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.428685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.429186 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6h8qn\" (UniqueName: \"kubernetes.io/projected/c260216b-1400-490d-89f1-69b3ff76e223-kube-api-access-6h8qn\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.429316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.429483 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-scripts\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.429678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c260216b-1400-490d-89f1-69b3ff76e223-logs\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.428351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c260216b-1400-490d-89f1-69b3ff76e223-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.430694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data-custom\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " 
pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.438203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data-custom\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.438934 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-scripts\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.439099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.446057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h8qn\" (UniqueName: \"kubernetes.io/projected/c260216b-1400-490d-89f1-69b3ff76e223-kube-api-access-6h8qn\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.459908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.564958 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.776967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64dbf95879-s4jqv" event={"ID":"d42a5890-7561-4b99-9518-0c6c672217d9","Type":"ContainerStarted","Data":"d3248dc7436fc843f10693a39a151a35e10c1d8b82ad67fe5a8d4f121bed3dee"} Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.827206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b5874f78-zwqfr" event={"ID":"2a827d62-4407-44f4-afc4-b21c06888c13","Type":"ContainerStarted","Data":"b6eff1054a07da3ed81401d43598253c43816d01fc221775a0bc1ae5456b3053"} Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.827475 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b5874f78-zwqfr" event={"ID":"2a827d62-4407-44f4-afc4-b21c06888c13","Type":"ContainerStarted","Data":"037a12ddd13f19e5e315c4fbd00959719382628576b53581bbf5564c8e7011c0"} Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.844258 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" event={"ID":"e52b1001-3fb0-415b-be6a-e55a548462ac","Type":"ContainerStarted","Data":"e254182d1ace1e9e0ef6baef24cb48ea9df059fe6607ffa52f9cfbf44827c614"} Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.849603 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.850951 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cbdae00-eb75-411e-8f1f-9d6a05d64628" containerID="385f0aafd82f037eba6ca4859ac1e61145af37e64a69a457e0f62d084063d076" exitCode=0 Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.851003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" 
event={"ID":"1cbdae00-eb75-411e-8f1f-9d6a05d64628","Type":"ContainerDied","Data":"385f0aafd82f037eba6ca4859ac1e61145af37e64a69a457e0f62d084063d076"} Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.851041 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" event={"ID":"1cbdae00-eb75-411e-8f1f-9d6a05d64628","Type":"ContainerStarted","Data":"524ca21e3871c628f07f62036353c67d5390e6c46d7260c64e0bd76c9a5c5ec5"} Oct 14 13:18:49 crc kubenswrapper[4837]: I1014 13:18:49.925771 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-674b76c99f-chvs9"] Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.172663 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:50 crc kubenswrapper[4837]: W1014 13:18:50.178485 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc260216b_1400_490d_89f1_69b3ff76e223.slice/crio-0fd28f09041afa751f24ee49f527c6f4c3a39c5397fa5935136a22f6acf0f461 WatchSource:0}: Error finding container 0fd28f09041afa751f24ee49f527c6f4c3a39c5397fa5935136a22f6acf0f461: Status 404 returned error can't find the container with id 0fd28f09041afa751f24ee49f527c6f4c3a39c5397fa5935136a22f6acf0f461 Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.198594 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.290425 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-nb\") pod \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.290500 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-sb\") pod \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.290532 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-svc\") pod \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.290606 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-swift-storage-0\") pod \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.290661 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-config\") pod \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.290693 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6cm7\" 
(UniqueName: \"kubernetes.io/projected/1cbdae00-eb75-411e-8f1f-9d6a05d64628-kube-api-access-d6cm7\") pod \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\" (UID: \"1cbdae00-eb75-411e-8f1f-9d6a05d64628\") " Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.309333 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbdae00-eb75-411e-8f1f-9d6a05d64628-kube-api-access-d6cm7" (OuterVolumeSpecName: "kube-api-access-d6cm7") pod "1cbdae00-eb75-411e-8f1f-9d6a05d64628" (UID: "1cbdae00-eb75-411e-8f1f-9d6a05d64628"). InnerVolumeSpecName "kube-api-access-d6cm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.329702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-config" (OuterVolumeSpecName: "config") pod "1cbdae00-eb75-411e-8f1f-9d6a05d64628" (UID: "1cbdae00-eb75-411e-8f1f-9d6a05d64628"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.337943 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cbdae00-eb75-411e-8f1f-9d6a05d64628" (UID: "1cbdae00-eb75-411e-8f1f-9d6a05d64628"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.351234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cbdae00-eb75-411e-8f1f-9d6a05d64628" (UID: "1cbdae00-eb75-411e-8f1f-9d6a05d64628"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.352686 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cbdae00-eb75-411e-8f1f-9d6a05d64628" (UID: "1cbdae00-eb75-411e-8f1f-9d6a05d64628"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.364247 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1cbdae00-eb75-411e-8f1f-9d6a05d64628" (UID: "1cbdae00-eb75-411e-8f1f-9d6a05d64628"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.392911 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.392963 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.392976 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6cm7\" (UniqueName: \"kubernetes.io/projected/1cbdae00-eb75-411e-8f1f-9d6a05d64628-kube-api-access-d6cm7\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.392989 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.393001 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.393013 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cbdae00-eb75-411e-8f1f-9d6a05d64628-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.870402 4837 generic.go:334] "Generic (PLEG): container finished" podID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerID="60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98" exitCode=0 Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.879605 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fc0917b-b4a9-4941-9dc1-199cb056d851","Type":"ContainerStarted","Data":"88b820aedc4e63d4c772a219f8f167678cad215ad9532dfb521b4d15ec83c2e7"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892817 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892834 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" event={"ID":"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b","Type":"ContainerDied","Data":"60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892846 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" 
event={"ID":"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b","Type":"ContainerStarted","Data":"ba6f3c944219a97e20c32e4ea78642ab23355ef50d86ffe4802b98741df81fd9"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892857 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b5874f78-zwqfr" event={"ID":"2a827d62-4407-44f4-afc4-b21c06888c13","Type":"ContainerStarted","Data":"e477eea2de62523b2cd64affc2214b6e787a2f94667d7f3f71d3d47bc3bcfc35"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892876 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c260216b-1400-490d-89f1-69b3ff76e223","Type":"ContainerStarted","Data":"0fd28f09041afa751f24ee49f527c6f4c3a39c5397fa5935136a22f6acf0f461"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892908 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-tpfp2" event={"ID":"1cbdae00-eb75-411e-8f1f-9d6a05d64628","Type":"ContainerDied","Data":"524ca21e3871c628f07f62036353c67d5390e6c46d7260c64e0bd76c9a5c5ec5"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.892921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerStarted","Data":"ab87507ffe9280dbefb00787352ba7e2cba8685d5133ab1228bcf833c3f25df9"} Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.893578 4837 scope.go:117] "RemoveContainer" containerID="385f0aafd82f037eba6ca4859ac1e61145af37e64a69a457e0f62d084063d076" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.914076 4837 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.05661281 podStartE2EDuration="7.914056344s" podCreationTimestamp="2025-10-14 13:18:43 +0000 UTC" firstStartedPulling="2025-10-14 13:18:44.512458187 +0000 UTC m=+1062.429457990" lastFinishedPulling="2025-10-14 13:18:50.369901711 +0000 UTC m=+1068.286901524" observedRunningTime="2025-10-14 13:18:50.913440898 +0000 UTC m=+1068.830440711" watchObservedRunningTime="2025-10-14 13:18:50.914056344 +0000 UTC m=+1068.831056157" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.937949 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56b5874f78-zwqfr" podStartSLOduration=2.937930452 podStartE2EDuration="2.937930452s" podCreationTimestamp="2025-10-14 13:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:50.936416362 +0000 UTC m=+1068.853416195" watchObservedRunningTime="2025-10-14 13:18:50.937930452 +0000 UTC m=+1068.854930265" Oct 14 13:18:50 crc kubenswrapper[4837]: I1014 13:18:50.993027 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-tpfp2"] Oct 14 13:18:51 crc kubenswrapper[4837]: I1014 13:18:51.007905 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-tpfp2"] Oct 14 13:18:51 crc kubenswrapper[4837]: I1014 13:18:51.932452 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c260216b-1400-490d-89f1-69b3ff76e223","Type":"ContainerStarted","Data":"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.687610 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.803844 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1cbdae00-eb75-411e-8f1f-9d6a05d64628" path="/var/lib/kubelet/pods/1cbdae00-eb75-411e-8f1f-9d6a05d64628/volumes" Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.946902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" event={"ID":"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b","Type":"ContainerStarted","Data":"1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.947940 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.951576 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c260216b-1400-490d-89f1-69b3ff76e223","Type":"ContainerStarted","Data":"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.952248 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.956902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" event={"ID":"e52b1001-3fb0-415b-be6a-e55a548462ac","Type":"ContainerStarted","Data":"0181096aee77a5be7b9c63e18c9fd465182befee78db837f07fb2ae908b38626"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.956960 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" event={"ID":"e52b1001-3fb0-415b-be6a-e55a548462ac","Type":"ContainerStarted","Data":"a9e434d878e96cf2af18cf95fe2ff1fdd4e01018d581e5745a28114358708d16"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.959524 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1fc0917b-b4a9-4941-9dc1-199cb056d851","Type":"ContainerStarted","Data":"396e2f9f257d64ed6dc19083af246afc9c8c9466320c103ae8107a805ed80193"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.968975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64dbf95879-s4jqv" event={"ID":"d42a5890-7561-4b99-9518-0c6c672217d9","Type":"ContainerStarted","Data":"f0ea05b6d432162dc30c3bce6feb79c857384833f8cf9f3380fff8a968e0711f"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.969015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64dbf95879-s4jqv" event={"ID":"d42a5890-7561-4b99-9518-0c6c672217d9","Type":"ContainerStarted","Data":"12d93dab9b37eb965281b5a503073fd200fbad1aed719891a5eb944bb781c52b"} Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.971762 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" podStartSLOduration=4.971745364 podStartE2EDuration="4.971745364s" podCreationTimestamp="2025-10-14 13:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:52.965784784 +0000 UTC m=+1070.882784587" watchObservedRunningTime="2025-10-14 13:18:52.971745364 +0000 UTC m=+1070.888745177" Oct 14 13:18:52 crc kubenswrapper[4837]: I1014 13:18:52.989174 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5dd7b6957d-hqts4" podStartSLOduration=3.303080161 podStartE2EDuration="5.989141319s" podCreationTimestamp="2025-10-14 13:18:47 +0000 UTC" firstStartedPulling="2025-10-14 13:18:48.970238621 +0000 UTC m=+1066.887238434" lastFinishedPulling="2025-10-14 13:18:51.656299779 +0000 UTC m=+1069.573299592" observedRunningTime="2025-10-14 13:18:52.983405105 +0000 UTC m=+1070.900404928" watchObservedRunningTime="2025-10-14 13:18:52.989141319 +0000 UTC 
m=+1070.906141122" Oct 14 13:18:53 crc kubenswrapper[4837]: I1014 13:18:53.011776 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.011756945 podStartE2EDuration="4.011756945s" podCreationTimestamp="2025-10-14 13:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:53.007651645 +0000 UTC m=+1070.924651458" watchObservedRunningTime="2025-10-14 13:18:53.011756945 +0000 UTC m=+1070.928756758" Oct 14 13:18:53 crc kubenswrapper[4837]: I1014 13:18:53.033605 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-64dbf95879-s4jqv" podStartSLOduration=3.181314804 podStartE2EDuration="6.033584229s" podCreationTimestamp="2025-10-14 13:18:47 +0000 UTC" firstStartedPulling="2025-10-14 13:18:48.804021873 +0000 UTC m=+1066.721021686" lastFinishedPulling="2025-10-14 13:18:51.656291298 +0000 UTC m=+1069.573291111" observedRunningTime="2025-10-14 13:18:53.028783571 +0000 UTC m=+1070.945783384" watchObservedRunningTime="2025-10-14 13:18:53.033584229 +0000 UTC m=+1070.950584062" Oct 14 13:18:53 crc kubenswrapper[4837]: I1014 13:18:53.983315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fc0917b-b4a9-4941-9dc1-199cb056d851","Type":"ContainerStarted","Data":"6951ea9fea4755588a597f68223b98419c708c016dadcee46d48e489d1d3c6e3"} Oct 14 13:18:53 crc kubenswrapper[4837]: I1014 13:18:53.984593 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api-log" containerID="cri-o://249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa" gracePeriod=30 Oct 14 13:18:53 crc kubenswrapper[4837]: I1014 13:18:53.984651 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api" containerID="cri-o://136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0" gracePeriod=30 Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.014562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.212471795 podStartE2EDuration="6.014545713s" podCreationTimestamp="2025-10-14 13:18:48 +0000 UTC" firstStartedPulling="2025-10-14 13:18:49.860766485 +0000 UTC m=+1067.777766298" lastFinishedPulling="2025-10-14 13:18:51.662840403 +0000 UTC m=+1069.579840216" observedRunningTime="2025-10-14 13:18:54.00810533 +0000 UTC m=+1071.925105143" watchObservedRunningTime="2025-10-14 13:18:54.014545713 +0000 UTC m=+1071.931545526" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.251054 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6655497d8d-h2r8r" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.318405 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.598986 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5799b74b9d-p594h"] Oct 14 13:18:54 crc kubenswrapper[4837]: E1014 13:18:54.599800 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbdae00-eb75-411e-8f1f-9d6a05d64628" containerName="init" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.599825 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbdae00-eb75-411e-8f1f-9d6a05d64628" containerName="init" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.600057 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbdae00-eb75-411e-8f1f-9d6a05d64628" containerName="init" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.601964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.604478 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.606456 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.613435 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5799b74b9d-p594h"] Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.651725 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.693669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data-custom\") pod \"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.693841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h8qn\" (UniqueName: \"kubernetes.io/projected/c260216b-1400-490d-89f1-69b3ff76e223-kube-api-access-6h8qn\") pod \"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.693878 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-combined-ca-bundle\") pod 
\"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.693967 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-scripts\") pod \"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694013 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data\") pod \"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c260216b-1400-490d-89f1-69b3ff76e223-logs\") pod \"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694070 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c260216b-1400-490d-89f1-69b3ff76e223-etc-machine-id\") pod \"c260216b-1400-490d-89f1-69b3ff76e223\" (UID: \"c260216b-1400-490d-89f1-69b3ff76e223\") " Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-internal-tls-certs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694375 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-combined-ca-bundle\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694396 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fd5\" (UniqueName: \"kubernetes.io/projected/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-kube-api-access-q8fd5\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694413 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-public-tls-certs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.694444 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-logs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.695740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-config-data-custom\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc 
kubenswrapper[4837]: I1014 13:18:54.695800 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-config-data\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.699375 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c260216b-1400-490d-89f1-69b3ff76e223-logs" (OuterVolumeSpecName: "logs") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.732235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c260216b-1400-490d-89f1-69b3ff76e223-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.732355 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.732898 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c260216b-1400-490d-89f1-69b3ff76e223-kube-api-access-6h8qn" (OuterVolumeSpecName: "kube-api-access-6h8qn") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "kube-api-access-6h8qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.732994 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-scripts" (OuterVolumeSpecName: "scripts") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.765093 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.792257 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data" (OuterVolumeSpecName: "config-data") pod "c260216b-1400-490d-89f1-69b3ff76e223" (UID: "c260216b-1400-490d-89f1-69b3ff76e223"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.799796 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-logs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.799899 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-config-data-custom\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.799951 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-config-data\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800004 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-internal-tls-certs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-combined-ca-bundle\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 
14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800236 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fd5\" (UniqueName: \"kubernetes.io/projected/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-kube-api-access-q8fd5\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800294 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-public-tls-certs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800403 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800451 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800464 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c260216b-1400-490d-89f1-69b3ff76e223-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800477 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c260216b-1400-490d-89f1-69b3ff76e223-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800492 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800537 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h8qn\" (UniqueName: \"kubernetes.io/projected/c260216b-1400-490d-89f1-69b3ff76e223-kube-api-access-6h8qn\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.800550 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c260216b-1400-490d-89f1-69b3ff76e223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.802248 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-logs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.806474 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-config-data\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.807093 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-public-tls-certs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.807755 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-internal-tls-certs\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.811599 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-config-data-custom\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.822064 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-combined-ca-bundle\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.824557 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fd5\" (UniqueName: \"kubernetes.io/projected/7cb7fa99-fe9e-4e56-a3ef-26c6ad271530-kube-api-access-q8fd5\") pod \"barbican-api-5799b74b9d-p594h\" (UID: \"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530\") " pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.964599 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995786 4837 generic.go:334] "Generic (PLEG): container finished" podID="c260216b-1400-490d-89f1-69b3ff76e223" containerID="136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0" exitCode=0 Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995825 4837 generic.go:334] "Generic (PLEG): container finished" podID="c260216b-1400-490d-89f1-69b3ff76e223" containerID="249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa" exitCode=143 Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995829 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c260216b-1400-490d-89f1-69b3ff76e223","Type":"ContainerDied","Data":"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0"} Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995851 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995874 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c260216b-1400-490d-89f1-69b3ff76e223","Type":"ContainerDied","Data":"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa"} Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995889 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c260216b-1400-490d-89f1-69b3ff76e223","Type":"ContainerDied","Data":"0fd28f09041afa751f24ee49f527c6f4c3a39c5397fa5935136a22f6acf0f461"} Oct 14 13:18:54 crc kubenswrapper[4837]: I1014 13:18:54.995909 4837 scope.go:117] "RemoveContainer" containerID="136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.033589 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:55 crc 
kubenswrapper[4837]: I1014 13:18:55.047396 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.058233 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:55 crc kubenswrapper[4837]: E1014 13:18:55.058790 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.058818 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api" Oct 14 13:18:55 crc kubenswrapper[4837]: E1014 13:18:55.058855 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api-log" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.058864 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api-log" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.059076 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api-log" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.059111 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c260216b-1400-490d-89f1-69b3ff76e223" containerName="cinder-api" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.060401 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.062830 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.063313 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.064202 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.083420 4837 scope.go:117] "RemoveContainer" containerID="249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.105537 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9c4x\" (UniqueName: \"kubernetes.io/projected/92e412ce-d61d-4c7f-8297-ce2cc5011325-kube-api-access-w9c4x\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.105844 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.105864 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-public-tls-certs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.105884 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-scripts\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.105916 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e412ce-d61d-4c7f-8297-ce2cc5011325-logs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.106014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92e412ce-d61d-4c7f-8297-ce2cc5011325-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.106034 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-config-data\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.106230 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.106272 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-config-data-custom\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.108385 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.182641 4837 scope.go:117] "RemoveContainer" containerID="136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0" Oct 14 13:18:55 crc kubenswrapper[4837]: E1014 13:18:55.183541 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0\": container with ID starting with 136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0 not found: ID does not exist" containerID="136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.183602 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0"} err="failed to get container status \"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0\": rpc error: code = NotFound desc = could not find container \"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0\": container with ID starting with 136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0 not found: ID does not exist" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.183628 4837 scope.go:117] "RemoveContainer" containerID="249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa" Oct 14 13:18:55 crc kubenswrapper[4837]: E1014 13:18:55.184040 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa\": container with ID starting with 249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa not found: ID does not exist" containerID="249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.184080 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa"} err="failed to get container status \"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa\": rpc error: code = NotFound desc = could not find container \"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa\": container with ID starting with 249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa not found: ID does not exist" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.184097 4837 scope.go:117] "RemoveContainer" containerID="136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.185396 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0"} err="failed to get container status \"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0\": rpc error: code = NotFound desc = could not find container \"136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0\": container with ID starting with 136901eb61792f152aca6977dfeb4c8279fb0e6416c8120f4aced9fa56efdee0 not found: ID does not exist" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.185413 4837 scope.go:117] "RemoveContainer" containerID="249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.186209 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa"} err="failed to get container status \"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa\": rpc error: code = NotFound desc = could not find container \"249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa\": container with ID starting with 249386da1cb0309101f5cd5d964df5e7abb583c198547206e7d36f49739271aa not found: ID does not exist" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.207831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.207867 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-config-data-custom\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.207922 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9c4x\" (UniqueName: \"kubernetes.io/projected/92e412ce-d61d-4c7f-8297-ce2cc5011325-kube-api-access-w9c4x\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.207968 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.207985 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-public-tls-certs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.208006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-scripts\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.208032 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e412ce-d61d-4c7f-8297-ce2cc5011325-logs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.208063 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92e412ce-d61d-4c7f-8297-ce2cc5011325-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.208077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-config-data\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.213187 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92e412ce-d61d-4c7f-8297-ce2cc5011325-logs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " 
pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.213461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.213609 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92e412ce-d61d-4c7f-8297-ce2cc5011325-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.219663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.223377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-public-tls-certs\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.224596 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-scripts\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.225892 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-config-data\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.233069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9c4x\" (UniqueName: \"kubernetes.io/projected/92e412ce-d61d-4c7f-8297-ce2cc5011325-kube-api-access-w9c4x\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.236032 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e412ce-d61d-4c7f-8297-ce2cc5011325-config-data-custom\") pod \"cinder-api-0\" (UID: \"92e412ce-d61d-4c7f-8297-ce2cc5011325\") " pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.449043 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.461969 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5799b74b9d-p594h"] Oct 14 13:18:55 crc kubenswrapper[4837]: I1014 13:18:55.877108 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 14 13:18:55 crc kubenswrapper[4837]: W1014 13:18:55.885327 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92e412ce_d61d_4c7f_8297_ce2cc5011325.slice/crio-2c31f7d1c519f67d27293dfed1b2bd058436ae3f0ba0ca89d40cb45b81666ed3 WatchSource:0}: Error finding container 2c31f7d1c519f67d27293dfed1b2bd058436ae3f0ba0ca89d40cb45b81666ed3: Status 404 returned error can't find the container with id 2c31f7d1c519f67d27293dfed1b2bd058436ae3f0ba0ca89d40cb45b81666ed3 Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.006891 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5799b74b9d-p594h" event={"ID":"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530","Type":"ContainerStarted","Data":"653aae495c69fc548e3eed2b76b5d03a80c276479cafa80e081a8d1163069251"} Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.006935 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5799b74b9d-p594h" event={"ID":"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530","Type":"ContainerStarted","Data":"94b3ff10a5d9e27a97d219211b394e846dd23c61cbb06d6f336865f0c501ca0f"} Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.006949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5799b74b9d-p594h" event={"ID":"7cb7fa99-fe9e-4e56-a3ef-26c6ad271530","Type":"ContainerStarted","Data":"a91454ef2fcd01f01551bb29d9bf6d0ef3900b4c666ef3014b968697bbd12975"} Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.006986 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.010869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92e412ce-d61d-4c7f-8297-ce2cc5011325","Type":"ContainerStarted","Data":"2c31f7d1c519f67d27293dfed1b2bd058436ae3f0ba0ca89d40cb45b81666ed3"} Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.012659 4837 generic.go:334] "Generic (PLEG): container finished" podID="3dc6adfa-9f60-4e67-ba33-98badd63dd5f" containerID="755881917e433eb82b4d7761844beaad423e0a39296429067ee6da8049a61511" exitCode=0 Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.012745 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mwvdq" event={"ID":"3dc6adfa-9f60-4e67-ba33-98badd63dd5f","Type":"ContainerDied","Data":"755881917e433eb82b4d7761844beaad423e0a39296429067ee6da8049a61511"} Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.041951 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5799b74b9d-p594h" podStartSLOduration=2.041933431 podStartE2EDuration="2.041933431s" podCreationTimestamp="2025-10-14 13:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:56.02989501 +0000 UTC m=+1073.946894823" watchObservedRunningTime="2025-10-14 13:18:56.041933431 +0000 UTC m=+1073.958933244" Oct 14 13:18:56 crc kubenswrapper[4837]: I1014 13:18:56.800748 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c260216b-1400-490d-89f1-69b3ff76e223" path="/var/lib/kubelet/pods/c260216b-1400-490d-89f1-69b3ff76e223/volumes" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.039080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"92e412ce-d61d-4c7f-8297-ce2cc5011325","Type":"ContainerStarted","Data":"f58b5e684215c97a900619161dfba16d7c305a8a7f5660cd25d514cb1c38a45e"} Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.040025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"92e412ce-d61d-4c7f-8297-ce2cc5011325","Type":"ContainerStarted","Data":"151b87b1adcfaab3c23b71c27635c7e010414470fc889d2a15ae0594ccac1e91"} Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.040195 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.040244 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.089590 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.089564729 podStartE2EDuration="2.089564729s" podCreationTimestamp="2025-10-14 13:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:18:57.072108942 +0000 UTC m=+1074.989108775" watchObservedRunningTime="2025-10-14 13:18:57.089564729 +0000 UTC m=+1075.006564552" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.401049 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.472933 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-combined-ca-bundle\") pod \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.473252 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-config\") pod \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.473404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn4zv\" (UniqueName: \"kubernetes.io/projected/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-kube-api-access-hn4zv\") pod \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\" (UID: \"3dc6adfa-9f60-4e67-ba33-98badd63dd5f\") " Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.476870 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-kube-api-access-hn4zv" (OuterVolumeSpecName: "kube-api-access-hn4zv") pod "3dc6adfa-9f60-4e67-ba33-98badd63dd5f" (UID: "3dc6adfa-9f60-4e67-ba33-98badd63dd5f"). InnerVolumeSpecName "kube-api-access-hn4zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.497276 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-config" (OuterVolumeSpecName: "config") pod "3dc6adfa-9f60-4e67-ba33-98badd63dd5f" (UID: "3dc6adfa-9f60-4e67-ba33-98badd63dd5f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.516011 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dc6adfa-9f60-4e67-ba33-98badd63dd5f" (UID: "3dc6adfa-9f60-4e67-ba33-98badd63dd5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.576438 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.576787 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn4zv\" (UniqueName: \"kubernetes.io/projected/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-kube-api-access-hn4zv\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:57 crc kubenswrapper[4837]: I1014 13:18:57.576921 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc6adfa-9f60-4e67-ba33-98badd63dd5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.052484 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mwvdq" event={"ID":"3dc6adfa-9f60-4e67-ba33-98badd63dd5f","Type":"ContainerDied","Data":"27249ad07076b3a7d7f38d610d4378b967197f686830d0a7e27f6c792ccc97e9"} Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.052809 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27249ad07076b3a7d7f38d610d4378b967197f686830d0a7e27f6c792ccc97e9" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.052712 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mwvdq" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.335126 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-674b76c99f-chvs9"] Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.337468 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerName="dnsmasq-dns" containerID="cri-o://1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1" gracePeriod=10 Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.340332 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.392482 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-w7l7p"] Oct 14 13:18:58 crc kubenswrapper[4837]: E1014 13:18:58.392906 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc6adfa-9f60-4e67-ba33-98badd63dd5f" containerName="neutron-db-sync" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.392923 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc6adfa-9f60-4e67-ba33-98badd63dd5f" containerName="neutron-db-sync" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.393139 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc6adfa-9f60-4e67-ba33-98badd63dd5f" containerName="neutron-db-sync" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.394109 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.426108 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-w7l7p"] Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.499299 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.499366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.499466 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.499508 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-config\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.499552 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.499602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9prks\" (UniqueName: \"kubernetes.io/projected/8773b825-0f73-4a74-9d59-522f75f7b425-kube-api-access-9prks\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.548817 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db89896b8-g9kwv"] Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.552298 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.577661 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.578063 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.579360 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hs978" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.579933 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.597422 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db89896b8-g9kwv"] Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602757 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ktn\" (UniqueName: \"kubernetes.io/projected/67361ebb-8351-4449-a97f-26f22d4263cc-kube-api-access-b9ktn\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-httpd-config\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602928 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-ovndb-tls-certs\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602955 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.602987 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-config\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.603027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.603060 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9prks\" (UniqueName: \"kubernetes.io/projected/8773b825-0f73-4a74-9d59-522f75f7b425-kube-api-access-9prks\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.603085 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-combined-ca-bundle\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.603131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-config\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.609065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.610393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.610908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-config\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.623941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.627987 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: 
\"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.663018 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9prks\" (UniqueName: \"kubernetes.io/projected/8773b825-0f73-4a74-9d59-522f75f7b425-kube-api-access-9prks\") pod \"dnsmasq-dns-6bb4fc677f-w7l7p\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.726895 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-combined-ca-bundle\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.726966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-config\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.727049 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ktn\" (UniqueName: \"kubernetes.io/projected/67361ebb-8351-4449-a97f-26f22d4263cc-kube-api-access-b9ktn\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.727139 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-httpd-config\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 
13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.727185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-ovndb-tls-certs\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.739857 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-combined-ca-bundle\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.753222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-ovndb-tls-certs\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.756618 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.766927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-httpd-config\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.767743 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-config\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.770994 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ktn\" (UniqueName: \"kubernetes.io/projected/67361ebb-8351-4449-a97f-26f22d4263cc-kube-api-access-b9ktn\") pod \"neutron-db89896b8-g9kwv\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:58 crc kubenswrapper[4837]: I1014 13:18:58.892646 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.017521 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.083922 4837 generic.go:334] "Generic (PLEG): container finished" podID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerID="1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1" exitCode=0 Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.083958 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" event={"ID":"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b","Type":"ContainerDied","Data":"1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1"} Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.083982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" event={"ID":"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b","Type":"ContainerDied","Data":"ba6f3c944219a97e20c32e4ea78642ab23355ef50d86ffe4802b98741df81fd9"} Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.083999 4837 scope.go:117] "RemoveContainer" containerID="1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.084136 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-674b76c99f-chvs9" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.121008 4837 scope.go:117] "RemoveContainer" containerID="60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.139454 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-nb\") pod \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.139495 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-svc\") pod \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.140940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-config\") pod \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.141024 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlprp\" (UniqueName: \"kubernetes.io/projected/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-kube-api-access-xlprp\") pod \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.141103 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-sb\") pod \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\" (UID: 
\"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.141126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-swift-storage-0\") pod \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\" (UID: \"5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b\") " Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.174145 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-kube-api-access-xlprp" (OuterVolumeSpecName: "kube-api-access-xlprp") pod "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" (UID: "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b"). InnerVolumeSpecName "kube-api-access-xlprp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.180210 4837 scope.go:117] "RemoveContainer" containerID="1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1" Oct 14 13:18:59 crc kubenswrapper[4837]: E1014 13:18:59.182585 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1\": container with ID starting with 1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1 not found: ID does not exist" containerID="1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.182614 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1"} err="failed to get container status \"1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1\": rpc error: code = NotFound desc = could not find container \"1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1\": container with ID 
starting with 1614f69a9cd7c0bc67a94e3c8c5376450434f15b97df9c8ef4f457e55dffa9c1 not found: ID does not exist" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.182634 4837 scope.go:117] "RemoveContainer" containerID="60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98" Oct 14 13:18:59 crc kubenswrapper[4837]: E1014 13:18:59.183687 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98\": container with ID starting with 60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98 not found: ID does not exist" containerID="60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.183709 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98"} err="failed to get container status \"60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98\": rpc error: code = NotFound desc = could not find container \"60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98\": container with ID starting with 60ef5a520db2e91f2dbda92916c611abe657b452b53cea12c57e8f8918beef98 not found: ID does not exist" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.230570 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" (UID: "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.239037 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-config" (OuterVolumeSpecName: "config") pod "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" (UID: "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.243322 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.243357 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.243369 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlprp\" (UniqueName: \"kubernetes.io/projected/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-kube-api-access-xlprp\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.253952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" (UID: "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.264591 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" (UID: "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.264931 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" (UID: "5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.344549 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.344585 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.344598 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.392227 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-w7l7p"] Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.448493 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-674b76c99f-chvs9"] Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.459961 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-674b76c99f-chvs9"] Oct 14 13:18:59 crc kubenswrapper[4837]: E1014 13:18:59.618274 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ca2cf0a_a8df_48cc_ace2_a0472ca8e39b.slice/crio-ba6f3c944219a97e20c32e4ea78642ab23355ef50d86ffe4802b98741df81fd9\": RecentStats: unable to find data in memory cache]" Oct 14 13:18:59 crc kubenswrapper[4837]: W1014 13:18:59.624503 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67361ebb_8351_4449_a97f_26f22d4263cc.slice/crio-3e1ce38c541b5c39fd5fa61cf923a29e24e5c1b8ff798d2fd076cbad1e470155 WatchSource:0}: Error finding container 3e1ce38c541b5c39fd5fa61cf923a29e24e5c1b8ff798d2fd076cbad1e470155: Status 404 returned error can't find the container with id 3e1ce38c541b5c39fd5fa61cf923a29e24e5c1b8ff798d2fd076cbad1e470155 Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.628194 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db89896b8-g9kwv"] Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.722394 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 14 13:18:59 crc kubenswrapper[4837]: I1014 13:18:59.787472 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.096448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db89896b8-g9kwv" event={"ID":"67361ebb-8351-4449-a97f-26f22d4263cc","Type":"ContainerStarted","Data":"dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37"} Oct 14 13:19:00 crc kubenswrapper[4837]: 
I1014 13:19:00.096504 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db89896b8-g9kwv" event={"ID":"67361ebb-8351-4449-a97f-26f22d4263cc","Type":"ContainerStarted","Data":"bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a"} Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.096546 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db89896b8-g9kwv" event={"ID":"67361ebb-8351-4449-a97f-26f22d4263cc","Type":"ContainerStarted","Data":"3e1ce38c541b5c39fd5fa61cf923a29e24e5c1b8ff798d2fd076cbad1e470155"} Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.097927 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.105475 4837 generic.go:334] "Generic (PLEG): container finished" podID="8773b825-0f73-4a74-9d59-522f75f7b425" containerID="267f2011388eaed0d785477edc6c93326ac3f5800a6cb94f5dff31bfbda6b23e" exitCode=0 Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.105729 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="cinder-scheduler" containerID="cri-o://396e2f9f257d64ed6dc19083af246afc9c8c9466320c103ae8107a805ed80193" gracePeriod=30 Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.106017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" event={"ID":"8773b825-0f73-4a74-9d59-522f75f7b425","Type":"ContainerDied","Data":"267f2011388eaed0d785477edc6c93326ac3f5800a6cb94f5dff31bfbda6b23e"} Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.106057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" event={"ID":"8773b825-0f73-4a74-9d59-522f75f7b425","Type":"ContainerStarted","Data":"853f07c4de0e2fe42461c1f73a6184b51dc4c822ed8d84e5141ecace62269a78"} Oct 14 
13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.106100 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="probe" containerID="cri-o://6951ea9fea4755588a597f68223b98419c708c016dadcee46d48e489d1d3c6e3" gracePeriod=30 Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.133850 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db89896b8-g9kwv" podStartSLOduration=2.133828454 podStartE2EDuration="2.133828454s" podCreationTimestamp="2025-10-14 13:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:00.128199213 +0000 UTC m=+1078.045199036" watchObservedRunningTime="2025-10-14 13:19:00.133828454 +0000 UTC m=+1078.050828267" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.597733 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.797890 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" path="/var/lib/kubelet/pods/5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b/volumes" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.798605 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f7c5db7df-7tsqg"] Oct 14 13:19:00 crc kubenswrapper[4837]: E1014 13:19:00.798977 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerName="dnsmasq-dns" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.799001 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerName="dnsmasq-dns" Oct 14 13:19:00 crc kubenswrapper[4837]: E1014 13:19:00.799016 4837 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerName="init" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.799025 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerName="init" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.799299 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca2cf0a-a8df-48cc-ace2-a0472ca8e39b" containerName="dnsmasq-dns" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.801491 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f7c5db7df-7tsqg"] Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.801664 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.804783 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.820566 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.877751 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-httpd-config\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.877838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-ovndb-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.877902 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l29k\" (UniqueName: \"kubernetes.io/projected/a7d3bc97-ce39-472b-860e-79b620b726f1-kube-api-access-5l29k\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.878019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-config\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.878087 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-internal-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.878368 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-combined-ca-bundle\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.878510 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-public-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 
13:19:00.980596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l29k\" (UniqueName: \"kubernetes.io/projected/a7d3bc97-ce39-472b-860e-79b620b726f1-kube-api-access-5l29k\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.980661 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-config\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.980692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-internal-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.980756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-combined-ca-bundle\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.980814 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-public-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.980860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-httpd-config\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.980911 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-ovndb-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.990002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-combined-ca-bundle\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.990148 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-config\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.990657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-internal-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:00 crc kubenswrapper[4837]: I1014 13:19:00.995850 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-httpd-config\") pod 
\"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.001257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-public-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.003065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l29k\" (UniqueName: \"kubernetes.io/projected/a7d3bc97-ce39-472b-860e-79b620b726f1-kube-api-access-5l29k\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.003727 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d3bc97-ce39-472b-860e-79b620b726f1-ovndb-tls-certs\") pod \"neutron-6f7c5db7df-7tsqg\" (UID: \"a7d3bc97-ce39-472b-860e-79b620b726f1\") " pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.116776 4837 generic.go:334] "Generic (PLEG): container finished" podID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerID="6951ea9fea4755588a597f68223b98419c708c016dadcee46d48e489d1d3c6e3" exitCode=0 Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.116820 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fc0917b-b4a9-4941-9dc1-199cb056d851","Type":"ContainerDied","Data":"6951ea9fea4755588a597f68223b98419c708c016dadcee46d48e489d1d3c6e3"} Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.119335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" 
event={"ID":"8773b825-0f73-4a74-9d59-522f75f7b425","Type":"ContainerStarted","Data":"8fc388ef14a51ca6ea0f41bd8aa332ab173480faacd4862f2d31d1bb865c048f"} Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.128114 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.128741 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.141846 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" podStartSLOduration=3.141831041 podStartE2EDuration="3.141831041s" podCreationTimestamp="2025-10-14 13:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:01.138993015 +0000 UTC m=+1079.055992828" watchObservedRunningTime="2025-10-14 13:19:01.141831041 +0000 UTC m=+1079.058830854" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.183691 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bcd589b8f-ljfsq" Oct 14 13:19:01 crc kubenswrapper[4837]: I1014 13:19:01.789450 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f7c5db7df-7tsqg"] Oct 14 13:19:02 crc kubenswrapper[4837]: I1014 13:19:02.134347 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f7c5db7df-7tsqg" event={"ID":"a7d3bc97-ce39-472b-860e-79b620b726f1","Type":"ContainerStarted","Data":"2b15504f80d16ae4494f82360b09a90d40e5857577e418782ce4f8897b97d71b"} Oct 14 13:19:02 crc kubenswrapper[4837]: I1014 13:19:02.134738 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:19:02 crc kubenswrapper[4837]: I1014 13:19:02.134766 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f7c5db7df-7tsqg" event={"ID":"a7d3bc97-ce39-472b-860e-79b620b726f1","Type":"ContainerStarted","Data":"81110492146ea7fd88d8a421394ea01d0af9fb7f3c62a4d1f88b58424252e712"} Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.143220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f7c5db7df-7tsqg" event={"ID":"a7d3bc97-ce39-472b-860e-79b620b726f1","Type":"ContainerStarted","Data":"3c41c92309b5e3621b4762ea7beba8c9816706d4228ac0fd5660265ecd8b0e63"} Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.143715 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.168643 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f7c5db7df-7tsqg" podStartSLOduration=3.168627424 podStartE2EDuration="3.168627424s" podCreationTimestamp="2025-10-14 13:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:03.166141937 +0000 UTC m=+1081.083141770" watchObservedRunningTime="2025-10-14 13:19:03.168627424 +0000 UTC m=+1081.085627227" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.194092 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.195120 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.198950 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vbq4n" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.199561 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.199637 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.205438 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.237016 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75fffdca-61c2-4af0-a87d-1662358aa171-openstack-config\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.237381 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75fffdca-61c2-4af0-a87d-1662358aa171-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.237466 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkh9x\" (UniqueName: \"kubernetes.io/projected/75fffdca-61c2-4af0-a87d-1662358aa171-kube-api-access-zkh9x\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.237881 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75fffdca-61c2-4af0-a87d-1662358aa171-openstack-config-secret\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.339728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75fffdca-61c2-4af0-a87d-1662358aa171-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.339821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkh9x\" (UniqueName: \"kubernetes.io/projected/75fffdca-61c2-4af0-a87d-1662358aa171-kube-api-access-zkh9x\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.340006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75fffdca-61c2-4af0-a87d-1662358aa171-openstack-config-secret\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.340218 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75fffdca-61c2-4af0-a87d-1662358aa171-openstack-config\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.341061 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/75fffdca-61c2-4af0-a87d-1662358aa171-openstack-config\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.351664 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75fffdca-61c2-4af0-a87d-1662358aa171-openstack-config-secret\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.356037 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75fffdca-61c2-4af0-a87d-1662358aa171-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.359282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkh9x\" (UniqueName: \"kubernetes.io/projected/75fffdca-61c2-4af0-a87d-1662358aa171-kube-api-access-zkh9x\") pod \"openstackclient\" (UID: \"75fffdca-61c2-4af0-a87d-1662358aa171\") " pod="openstack/openstackclient" Oct 14 13:19:03 crc kubenswrapper[4837]: I1014 13:19:03.513625 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 14 13:19:04 crc kubenswrapper[4837]: I1014 13:19:04.021136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 14 13:19:04 crc kubenswrapper[4837]: I1014 13:19:04.250648 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6655497d8d-h2r8r" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Oct 14 13:19:04 crc kubenswrapper[4837]: I1014 13:19:04.250852 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.162299 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75fffdca-61c2-4af0-a87d-1662358aa171","Type":"ContainerStarted","Data":"0b4cedc83858746066e910c244589660d6d157a76cc9fa281a48402c8e8e3626"} Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.164746 4837 generic.go:334] "Generic (PLEG): container finished" podID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerID="396e2f9f257d64ed6dc19083af246afc9c8c9466320c103ae8107a805ed80193" exitCode=0 Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.164812 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fc0917b-b4a9-4941-9dc1-199cb056d851","Type":"ContainerDied","Data":"396e2f9f257d64ed6dc19083af246afc9c8c9466320c103ae8107a805ed80193"} Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.751063 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.798227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data-custom\") pod \"1fc0917b-b4a9-4941-9dc1-199cb056d851\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.798307 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data\") pod \"1fc0917b-b4a9-4941-9dc1-199cb056d851\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.798466 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fc0917b-b4a9-4941-9dc1-199cb056d851-etc-machine-id\") pod \"1fc0917b-b4a9-4941-9dc1-199cb056d851\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.798640 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-combined-ca-bundle\") pod \"1fc0917b-b4a9-4941-9dc1-199cb056d851\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.799301 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg8fl\" (UniqueName: \"kubernetes.io/projected/1fc0917b-b4a9-4941-9dc1-199cb056d851-kube-api-access-gg8fl\") pod \"1fc0917b-b4a9-4941-9dc1-199cb056d851\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.799392 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-scripts\") pod \"1fc0917b-b4a9-4941-9dc1-199cb056d851\" (UID: \"1fc0917b-b4a9-4941-9dc1-199cb056d851\") " Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.799638 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1fc0917b-b4a9-4941-9dc1-199cb056d851-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1fc0917b-b4a9-4941-9dc1-199cb056d851" (UID: "1fc0917b-b4a9-4941-9dc1-199cb056d851"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.800391 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1fc0917b-b4a9-4941-9dc1-199cb056d851-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.822740 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-scripts" (OuterVolumeSpecName: "scripts") pod "1fc0917b-b4a9-4941-9dc1-199cb056d851" (UID: "1fc0917b-b4a9-4941-9dc1-199cb056d851"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.902735 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.971865 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1fc0917b-b4a9-4941-9dc1-199cb056d851" (UID: "1fc0917b-b4a9-4941-9dc1-199cb056d851"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.971935 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data" (OuterVolumeSpecName: "config-data") pod "1fc0917b-b4a9-4941-9dc1-199cb056d851" (UID: "1fc0917b-b4a9-4941-9dc1-199cb056d851"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.977553 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc0917b-b4a9-4941-9dc1-199cb056d851" (UID: "1fc0917b-b4a9-4941-9dc1-199cb056d851"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:05 crc kubenswrapper[4837]: I1014 13:19:05.983559 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc0917b-b4a9-4941-9dc1-199cb056d851-kube-api-access-gg8fl" (OuterVolumeSpecName: "kube-api-access-gg8fl") pod "1fc0917b-b4a9-4941-9dc1-199cb056d851" (UID: "1fc0917b-b4a9-4941-9dc1-199cb056d851"). InnerVolumeSpecName "kube-api-access-gg8fl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.005020 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.005067 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.005080 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc0917b-b4a9-4941-9dc1-199cb056d851-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.005092 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg8fl\" (UniqueName: \"kubernetes.io/projected/1fc0917b-b4a9-4941-9dc1-199cb056d851-kube-api-access-gg8fl\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.174724 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1fc0917b-b4a9-4941-9dc1-199cb056d851","Type":"ContainerDied","Data":"88b820aedc4e63d4c772a219f8f167678cad215ad9532dfb521b4d15ec83c2e7"} Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.174787 4837 scope.go:117] "RemoveContainer" containerID="6951ea9fea4755588a597f68223b98419c708c016dadcee46d48e489d1d3c6e3" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.174908 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.203948 4837 scope.go:117] "RemoveContainer" containerID="396e2f9f257d64ed6dc19083af246afc9c8c9466320c103ae8107a805ed80193" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.220290 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.234144 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.258821 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:19:06 crc kubenswrapper[4837]: E1014 13:19:06.259243 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="probe" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.259258 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="probe" Oct 14 13:19:06 crc kubenswrapper[4837]: E1014 13:19:06.259295 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="cinder-scheduler" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.259301 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="cinder-scheduler" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.259465 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="probe" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.259482 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" containerName="cinder-scheduler" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.260388 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.262072 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.281483 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.312005 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.312068 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.312277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45dpp\" (UniqueName: \"kubernetes.io/projected/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-kube-api-access-45dpp\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.312308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc 
kubenswrapper[4837]: I1014 13:19:06.312450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-scripts\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.312540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-config-data\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.414761 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.414830 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.414854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45dpp\" (UniqueName: \"kubernetes.io/projected/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-kube-api-access-45dpp\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.414899 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.414985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-scripts\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.415033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-config-data\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.415488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.497830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.498577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") 
" pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.498659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-scripts\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.499091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45dpp\" (UniqueName: \"kubernetes.io/projected/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-kube-api-access-45dpp\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.499672 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0d5e89-66d3-4f22-9704-7c3c35ee537f-config-data\") pod \"cinder-scheduler-0\" (UID: \"aa0d5e89-66d3-4f22-9704-7c3c35ee537f\") " pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.629942 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.794459 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.798043 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc0917b-b4a9-4941-9dc1-199cb056d851" path="/var/lib/kubelet/pods/1fc0917b-b4a9-4941-9dc1-199cb056d851/volumes" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.799031 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5799b74b9d-p594h" Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.897568 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56b5874f78-zwqfr"] Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.902683 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56b5874f78-zwqfr" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api" containerID="cri-o://e477eea2de62523b2cd64affc2214b6e787a2f94667d7f3f71d3d47bc3bcfc35" gracePeriod=30 Oct 14 13:19:06 crc kubenswrapper[4837]: I1014 13:19:06.902980 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56b5874f78-zwqfr" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api-log" containerID="cri-o://b6eff1054a07da3ed81401d43598253c43816d01fc221775a0bc1ae5456b3053" gracePeriod=30 Oct 14 13:19:07 crc kubenswrapper[4837]: I1014 13:19:07.199367 4837 generic.go:334] "Generic (PLEG): container finished" podID="2a827d62-4407-44f4-afc4-b21c06888c13" containerID="b6eff1054a07da3ed81401d43598253c43816d01fc221775a0bc1ae5456b3053" exitCode=143 Oct 14 13:19:07 crc kubenswrapper[4837]: I1014 13:19:07.199691 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b5874f78-zwqfr" event={"ID":"2a827d62-4407-44f4-afc4-b21c06888c13","Type":"ContainerDied","Data":"b6eff1054a07da3ed81401d43598253c43816d01fc221775a0bc1ae5456b3053"} Oct 14 13:19:07 crc kubenswrapper[4837]: 
I1014 13:19:07.396252 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 14 13:19:07 crc kubenswrapper[4837]: I1014 13:19:07.443066 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 14 13:19:08 crc kubenswrapper[4837]: I1014 13:19:08.220539 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa0d5e89-66d3-4f22-9704-7c3c35ee537f","Type":"ContainerStarted","Data":"eccec996e23b9062ad8670464430168fe0d0aebc02dd41a312ce4992b68d5a3d"} Oct 14 13:19:08 crc kubenswrapper[4837]: I1014 13:19:08.758316 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:19:08 crc kubenswrapper[4837]: I1014 13:19:08.834743 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vfpnl"] Oct 14 13:19:08 crc kubenswrapper[4837]: I1014 13:19:08.834988 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="dnsmasq-dns" containerID="cri-o://9d08ef2e1c90280b6a0b75b4ae79f5bd1ce77f65ce9892790425ec06439d51b6" gracePeriod=10 Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.246329 4837 generic.go:334] "Generic (PLEG): container finished" podID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerID="b49e994a7536096b426fc73f5d5665d1e524c08924c7a054563890c3689310a7" exitCode=137 Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.246389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6655497d8d-h2r8r" event={"ID":"2372174a-82ac-4421-a4e7-8ffcd7b4e92f","Type":"ContainerDied","Data":"b49e994a7536096b426fc73f5d5665d1e524c08924c7a054563890c3689310a7"} Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.251834 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"aa0d5e89-66d3-4f22-9704-7c3c35ee537f","Type":"ContainerStarted","Data":"14c3419ecb01774d662f40ee4657ab7a41ceb47eead92c7f07bfdd73d5f92deb"} Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.675364 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788497 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-config-data\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788580 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-combined-ca-bundle\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788606 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjtth\" (UniqueName: \"kubernetes.io/projected/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-kube-api-access-mjtth\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788672 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-secret-key\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788810 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-logs\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788861 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-scripts\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.788910 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-tls-certs\") pod \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\" (UID: \"2372174a-82ac-4421-a4e7-8ffcd7b4e92f\") " Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.789566 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-logs" (OuterVolumeSpecName: "logs") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.789822 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.820478 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.820805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-kube-api-access-mjtth" (OuterVolumeSpecName: "kube-api-access-mjtth") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "kube-api-access-mjtth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.842745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.854847 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-config-data" (OuterVolumeSpecName: "config-data") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.856511 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-scripts" (OuterVolumeSpecName: "scripts") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.858562 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "2372174a-82ac-4421-a4e7-8ffcd7b4e92f" (UID: "2372174a-82ac-4421-a4e7-8ffcd7b4e92f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.891760 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.891801 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.891846 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.891858 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.891872 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjtth\" (UniqueName: \"kubernetes.io/projected/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-kube-api-access-mjtth\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:09 crc kubenswrapper[4837]: I1014 13:19:09.892782 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/2372174a-82ac-4421-a4e7-8ffcd7b4e92f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.079746 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56b5874f78-zwqfr" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:51692->10.217.0.160:9311: read: connection reset by peer" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.079873 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56b5874f78-zwqfr" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:51702->10.217.0.160:9311: read: connection reset by peer" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.270151 4837 generic.go:334] "Generic (PLEG): container finished" podID="2a827d62-4407-44f4-afc4-b21c06888c13" containerID="e477eea2de62523b2cd64affc2214b6e787a2f94667d7f3f71d3d47bc3bcfc35" exitCode=0 Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.270193 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b5874f78-zwqfr" event={"ID":"2a827d62-4407-44f4-afc4-b21c06888c13","Type":"ContainerDied","Data":"e477eea2de62523b2cd64affc2214b6e787a2f94667d7f3f71d3d47bc3bcfc35"} Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.272766 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6655497d8d-h2r8r" event={"ID":"2372174a-82ac-4421-a4e7-8ffcd7b4e92f","Type":"ContainerDied","Data":"5beb8d9c77a85be189ef4cdb1f7c88ef94c285be00447b49a472d62ee70c4b74"} Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.272809 4837 scope.go:117] "RemoveContainer" containerID="ce844888499b913841277f9d1a2c07dd86967378d2867cfebad77ec15be8e2c0" Oct 14 13:19:10 crc 
kubenswrapper[4837]: I1014 13:19:10.272912 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6655497d8d-h2r8r" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.280393 4837 generic.go:334] "Generic (PLEG): container finished" podID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerID="9d08ef2e1c90280b6a0b75b4ae79f5bd1ce77f65ce9892790425ec06439d51b6" exitCode=0 Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.280511 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" event={"ID":"6c469b81-89a4-4d35-bc9a-b04b82c2571e","Type":"ContainerDied","Data":"9d08ef2e1c90280b6a0b75b4ae79f5bd1ce77f65ce9892790425ec06439d51b6"} Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.283638 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa0d5e89-66d3-4f22-9704-7c3c35ee537f","Type":"ContainerStarted","Data":"f746ba41f11f6564e7da309562c6bc3e42436c568e6ab78b1c2d459d7101eed7"} Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.315407 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.315389043 podStartE2EDuration="4.315389043s" podCreationTimestamp="2025-10-14 13:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:10.302567211 +0000 UTC m=+1088.219567024" watchObservedRunningTime="2025-10-14 13:19:10.315389043 +0000 UTC m=+1088.232388866" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.338410 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6655497d8d-h2r8r"] Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.345191 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6655497d8d-h2r8r"] Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.472134 4837 
scope.go:117] "RemoveContainer" containerID="b49e994a7536096b426fc73f5d5665d1e524c08924c7a054563890c3689310a7" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.599357 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.681430 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.709880 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-svc\") pod \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.710182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-sb\") pod \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.710390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-swift-storage-0\") pod \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.710696 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-nb\") pod \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.710827 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-config\") pod \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.710930 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctst5\" (UniqueName: \"kubernetes.io/projected/6c469b81-89a4-4d35-bc9a-b04b82c2571e-kube-api-access-ctst5\") pod \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\" (UID: \"6c469b81-89a4-4d35-bc9a-b04b82c2571e\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.726514 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c469b81-89a4-4d35-bc9a-b04b82c2571e-kube-api-access-ctst5" (OuterVolumeSpecName: "kube-api-access-ctst5") pod "6c469b81-89a4-4d35-bc9a-b04b82c2571e" (UID: "6c469b81-89a4-4d35-bc9a-b04b82c2571e"). InnerVolumeSpecName "kube-api-access-ctst5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.765223 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-config" (OuterVolumeSpecName: "config") pod "6c469b81-89a4-4d35-bc9a-b04b82c2571e" (UID: "6c469b81-89a4-4d35-bc9a-b04b82c2571e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.765841 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c469b81-89a4-4d35-bc9a-b04b82c2571e" (UID: "6c469b81-89a4-4d35-bc9a-b04b82c2571e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.766308 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c469b81-89a4-4d35-bc9a-b04b82c2571e" (UID: "6c469b81-89a4-4d35-bc9a-b04b82c2571e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.780363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c469b81-89a4-4d35-bc9a-b04b82c2571e" (UID: "6c469b81-89a4-4d35-bc9a-b04b82c2571e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.788529 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c469b81-89a4-4d35-bc9a-b04b82c2571e" (UID: "6c469b81-89a4-4d35-bc9a-b04b82c2571e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.795906 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" path="/var/lib/kubelet/pods/2372174a-82ac-4421-a4e7-8ffcd7b4e92f/volumes" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.816978 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data-custom\") pod \"2a827d62-4407-44f4-afc4-b21c06888c13\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.817241 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a827d62-4407-44f4-afc4-b21c06888c13-logs\") pod \"2a827d62-4407-44f4-afc4-b21c06888c13\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.817291 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxpv4\" (UniqueName: \"kubernetes.io/projected/2a827d62-4407-44f4-afc4-b21c06888c13-kube-api-access-wxpv4\") pod \"2a827d62-4407-44f4-afc4-b21c06888c13\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.817382 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-combined-ca-bundle\") pod \"2a827d62-4407-44f4-afc4-b21c06888c13\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.817417 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data\") pod 
\"2a827d62-4407-44f4-afc4-b21c06888c13\" (UID: \"2a827d62-4407-44f4-afc4-b21c06888c13\") " Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.818384 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.818405 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.818415 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.818454 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctst5\" (UniqueName: \"kubernetes.io/projected/6c469b81-89a4-4d35-bc9a-b04b82c2571e-kube-api-access-ctst5\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.818469 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.818487 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c469b81-89a4-4d35-bc9a-b04b82c2571e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.828418 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2a827d62-4407-44f4-afc4-b21c06888c13" 
(UID: "2a827d62-4407-44f4-afc4-b21c06888c13"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.828613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a827d62-4407-44f4-afc4-b21c06888c13-logs" (OuterVolumeSpecName: "logs") pod "2a827d62-4407-44f4-afc4-b21c06888c13" (UID: "2a827d62-4407-44f4-afc4-b21c06888c13"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.844092 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a827d62-4407-44f4-afc4-b21c06888c13-kube-api-access-wxpv4" (OuterVolumeSpecName: "kube-api-access-wxpv4") pod "2a827d62-4407-44f4-afc4-b21c06888c13" (UID: "2a827d62-4407-44f4-afc4-b21c06888c13"). InnerVolumeSpecName "kube-api-access-wxpv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.864352 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a827d62-4407-44f4-afc4-b21c06888c13" (UID: "2a827d62-4407-44f4-afc4-b21c06888c13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.896325 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data" (OuterVolumeSpecName: "config-data") pod "2a827d62-4407-44f4-afc4-b21c06888c13" (UID: "2a827d62-4407-44f4-afc4-b21c06888c13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.921252 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a827d62-4407-44f4-afc4-b21c06888c13-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.921293 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxpv4\" (UniqueName: \"kubernetes.io/projected/2a827d62-4407-44f4-afc4-b21c06888c13-kube-api-access-wxpv4\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.921305 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.921316 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:10 crc kubenswrapper[4837]: I1014 13:19:10.921326 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a827d62-4407-44f4-afc4-b21c06888c13-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.140750 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.140798 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147192 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-68b7b9db59-mdpgm"] Oct 14 13:19:11 crc kubenswrapper[4837]: E1014 13:19:11.147579 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api-log" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147598 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api-log" Oct 14 13:19:11 crc kubenswrapper[4837]: E1014 13:19:11.147612 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="init" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147619 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="init" Oct 14 13:19:11 crc kubenswrapper[4837]: E1014 13:19:11.147627 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147635 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api" Oct 14 13:19:11 crc kubenswrapper[4837]: E1014 13:19:11.147652 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147657 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" Oct 14 13:19:11 crc kubenswrapper[4837]: E1014 13:19:11.147674 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon-log" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147679 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon-log" Oct 14 13:19:11 crc kubenswrapper[4837]: E1014 13:19:11.147694 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="dnsmasq-dns" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147700 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="dnsmasq-dns" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147863 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api-log" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147874 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="dnsmasq-dns" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147895 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147904 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" containerName="barbican-api" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.147912 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2372174a-82ac-4421-a4e7-8ffcd7b4e92f" containerName="horizon-log" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.148762 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.151034 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.151341 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.151878 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.206029 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-68b7b9db59-mdpgm"] Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9xk\" (UniqueName: \"kubernetes.io/projected/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-kube-api-access-sr9xk\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227599 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-internal-tls-certs\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-config-data\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc 
kubenswrapper[4837]: I1014 13:19:11.227682 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-combined-ca-bundle\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-etc-swift\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227769 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-log-httpd\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227820 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-public-tls-certs\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.227839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-run-httpd\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 
14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.300432 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b5874f78-zwqfr" event={"ID":"2a827d62-4407-44f4-afc4-b21c06888c13","Type":"ContainerDied","Data":"037a12ddd13f19e5e315c4fbd00959719382628576b53581bbf5564c8e7011c0"} Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.300491 4837 scope.go:117] "RemoveContainer" containerID="e477eea2de62523b2cd64affc2214b6e787a2f94667d7f3f71d3d47bc3bcfc35" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.300622 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b5874f78-zwqfr" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.319297 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.319292 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" event={"ID":"6c469b81-89a4-4d35-bc9a-b04b82c2571e","Type":"ContainerDied","Data":"dcceead411ebf1fe41ae21b7356ab880d8b7974fa3baa4888d19ea8c330528ec"} Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345221 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-config-data\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345296 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-combined-ca-bundle\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: 
I1014 13:19:11.345352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-etc-swift\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-log-httpd\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345421 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-public-tls-certs\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-run-httpd\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345519 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9xk\" (UniqueName: \"kubernetes.io/projected/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-kube-api-access-sr9xk\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.345560 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-internal-tls-certs\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.348187 4837 scope.go:117] "RemoveContainer" containerID="b6eff1054a07da3ed81401d43598253c43816d01fc221775a0bc1ae5456b3053" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.348575 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-log-httpd\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.352553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-run-httpd\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.356220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-internal-tls-certs\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.360907 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-etc-swift\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc 
kubenswrapper[4837]: I1014 13:19:11.369660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-config-data\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.370913 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-combined-ca-bundle\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.373214 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56b5874f78-zwqfr"] Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.374423 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-public-tls-certs\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.376389 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9xk\" (UniqueName: \"kubernetes.io/projected/7ac8f443-1071-49d6-94d2-e7fea6f09cc5-kube-api-access-sr9xk\") pod \"swift-proxy-68b7b9db59-mdpgm\" (UID: \"7ac8f443-1071-49d6-94d2-e7fea6f09cc5\") " pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.380931 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56b5874f78-zwqfr"] Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.390324 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57c957c4ff-vfpnl"] Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.396461 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vfpnl"] Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.466135 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.471830 4837 scope.go:117] "RemoveContainer" containerID="9d08ef2e1c90280b6a0b75b4ae79f5bd1ce77f65ce9892790425ec06439d51b6" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.507416 4837 scope.go:117] "RemoveContainer" containerID="07f1f4ad139a0ff3fd1e0f168b30f101cfaf9f882f276f26104ace570e865189" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.794657 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.829096 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.829345 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-central-agent" containerID="cri-o://0bde1f81d042890b7a3b5b737d24072929d61912234fda517a5f54771adaba5f" gracePeriod=30 Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.831673 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="sg-core" containerID="cri-o://c58ba7655a29419ccb05230c455b8c766bbb730cb600598ff0094a57dffba076" gracePeriod=30 Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.831822 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" 
containerName="proxy-httpd" containerID="cri-o://ab87507ffe9280dbefb00787352ba7e2cba8685d5133ab1228bcf833c3f25df9" gracePeriod=30 Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.831862 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-notification-agent" containerID="cri-o://2593160a9c4c902c162deaa393413d006bee9d82c3b266a59a75d4e584cfa929" gracePeriod=30 Oct 14 13:19:11 crc kubenswrapper[4837]: I1014 13:19:11.840470 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:19:12 crc kubenswrapper[4837]: I1014 13:19:12.074241 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-68b7b9db59-mdpgm"] Oct 14 13:19:12 crc kubenswrapper[4837]: W1014 13:19:12.080758 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ac8f443_1071_49d6_94d2_e7fea6f09cc5.slice/crio-2725ddb824974d0e6104bd97b87f6a3986eaa540e2029261b03563ec044bb5c1 WatchSource:0}: Error finding container 2725ddb824974d0e6104bd97b87f6a3986eaa540e2029261b03563ec044bb5c1: Status 404 returned error can't find the container with id 2725ddb824974d0e6104bd97b87f6a3986eaa540e2029261b03563ec044bb5c1 Oct 14 13:19:12 crc kubenswrapper[4837]: I1014 13:19:12.330263 4837 generic.go:334] "Generic (PLEG): container finished" podID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerID="c58ba7655a29419ccb05230c455b8c766bbb730cb600598ff0094a57dffba076" exitCode=2 Oct 14 13:19:12 crc kubenswrapper[4837]: I1014 13:19:12.330483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerDied","Data":"c58ba7655a29419ccb05230c455b8c766bbb730cb600598ff0094a57dffba076"} Oct 14 13:19:12 crc kubenswrapper[4837]: I1014 13:19:12.333609 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b7b9db59-mdpgm" event={"ID":"7ac8f443-1071-49d6-94d2-e7fea6f09cc5","Type":"ContainerStarted","Data":"2725ddb824974d0e6104bd97b87f6a3986eaa540e2029261b03563ec044bb5c1"} Oct 14 13:19:12 crc kubenswrapper[4837]: I1014 13:19:12.804583 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a827d62-4407-44f4-afc4-b21c06888c13" path="/var/lib/kubelet/pods/2a827d62-4407-44f4-afc4-b21c06888c13/volumes" Oct 14 13:19:12 crc kubenswrapper[4837]: I1014 13:19:12.805399 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" path="/var/lib/kubelet/pods/6c469b81-89a4-4d35-bc9a-b04b82c2571e/volumes" Oct 14 13:19:13 crc kubenswrapper[4837]: I1014 13:19:13.348263 4837 generic.go:334] "Generic (PLEG): container finished" podID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerID="ab87507ffe9280dbefb00787352ba7e2cba8685d5133ab1228bcf833c3f25df9" exitCode=0 Oct 14 13:19:13 crc kubenswrapper[4837]: I1014 13:19:13.348646 4837 generic.go:334] "Generic (PLEG): container finished" podID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerID="0bde1f81d042890b7a3b5b737d24072929d61912234fda517a5f54771adaba5f" exitCode=0 Oct 14 13:19:13 crc kubenswrapper[4837]: I1014 13:19:13.348438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerDied","Data":"ab87507ffe9280dbefb00787352ba7e2cba8685d5133ab1228bcf833c3f25df9"} Oct 14 13:19:13 crc kubenswrapper[4837]: I1014 13:19:13.348751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerDied","Data":"0bde1f81d042890b7a3b5b737d24072929d61912234fda517a5f54771adaba5f"} Oct 14 13:19:13 crc kubenswrapper[4837]: I1014 13:19:13.352500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b7b9db59-mdpgm" 
event={"ID":"7ac8f443-1071-49d6-94d2-e7fea6f09cc5","Type":"ContainerStarted","Data":"bb6266e0d039339616fa74e50400090264310237991074d4828668229f2bbda0"} Oct 14 13:19:14 crc kubenswrapper[4837]: I1014 13:19:14.091551 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.155:3000/\": dial tcp 10.217.0.155:3000: connect: connection refused" Oct 14 13:19:15 crc kubenswrapper[4837]: I1014 13:19:15.522697 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-vfpnl" podUID="6c469b81-89a4-4d35-bc9a-b04b82c2571e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout" Oct 14 13:19:16 crc kubenswrapper[4837]: I1014 13:19:16.385877 4837 generic.go:334] "Generic (PLEG): container finished" podID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerID="2593160a9c4c902c162deaa393413d006bee9d82c3b266a59a75d4e584cfa929" exitCode=0 Oct 14 13:19:16 crc kubenswrapper[4837]: I1014 13:19:16.385917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerDied","Data":"2593160a9c4c902c162deaa393413d006bee9d82c3b266a59a75d4e584cfa929"} Oct 14 13:19:17 crc kubenswrapper[4837]: I1014 13:19:17.045296 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 14 13:19:17 crc kubenswrapper[4837]: I1014 13:19:17.213750 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:19:17 crc kubenswrapper[4837]: I1014 13:19:17.219371 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67c99f9644-lpk76" Oct 14 13:19:18 crc kubenswrapper[4837]: I1014 13:19:18.358710 4837 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-external-api-0"] Oct 14 13:19:18 crc kubenswrapper[4837]: I1014 13:19:18.359542 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-httpd" containerID="cri-o://459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278" gracePeriod=30 Oct 14 13:19:18 crc kubenswrapper[4837]: I1014 13:19:18.359494 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-log" containerID="cri-o://91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6" gracePeriod=30 Oct 14 13:19:18 crc kubenswrapper[4837]: I1014 13:19:18.855034 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.011864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-config-data\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012003 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-scripts\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012035 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-run-httpd\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 
crc kubenswrapper[4837]: I1014 13:19:19.012057 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-sg-core-conf-yaml\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012093 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-combined-ca-bundle\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012124 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p67mg\" (UniqueName: \"kubernetes.io/projected/148e6967-e15d-4c5c-89db-5a029e0ce45b-kube-api-access-p67mg\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012146 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-log-httpd\") pod \"148e6967-e15d-4c5c-89db-5a029e0ce45b\" (UID: \"148e6967-e15d-4c5c-89db-5a029e0ce45b\") " Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.012902 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.018427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148e6967-e15d-4c5c-89db-5a029e0ce45b-kube-api-access-p67mg" (OuterVolumeSpecName: "kube-api-access-p67mg") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "kube-api-access-p67mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.020262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-scripts" (OuterVolumeSpecName: "scripts") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.048794 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.088606 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.115259 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.115295 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.115306 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.115315 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.115323 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p67mg\" (UniqueName: \"kubernetes.io/projected/148e6967-e15d-4c5c-89db-5a029e0ce45b-kube-api-access-p67mg\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.115332 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/148e6967-e15d-4c5c-89db-5a029e0ce45b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.139269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-config-data" (OuterVolumeSpecName: "config-data") pod "148e6967-e15d-4c5c-89db-5a029e0ce45b" (UID: "148e6967-e15d-4c5c-89db-5a029e0ce45b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.216378 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148e6967-e15d-4c5c-89db-5a029e0ce45b-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.226736 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.227078 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-httpd" containerID="cri-o://7f25a0404cc944c68b93645f8ffbce8a82ecc4281bf1bcb1f9ddbaaba056010e" gracePeriod=30 Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.227003 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-log" containerID="cri-o://cb88e62a1892c980c3120b04651c4bf852a694eb483b15b3ef58a88d1cef8227" gracePeriod=30 Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.414340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75fffdca-61c2-4af0-a87d-1662358aa171","Type":"ContainerStarted","Data":"5dfd52554a0aba6e8e3757041fd21db11ea0b91edcc9e0760fe6d6e113f6463b"} Oct 14 13:19:19 
crc kubenswrapper[4837]: I1014 13:19:19.416911 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-68b7b9db59-mdpgm" event={"ID":"7ac8f443-1071-49d6-94d2-e7fea6f09cc5","Type":"ContainerStarted","Data":"b88ef132b22282af3079e5b670bedf10ec9741946b5399de950c0bccd73be412"} Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.417032 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.417124 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.419404 4837 generic.go:334] "Generic (PLEG): container finished" podID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerID="cb88e62a1892c980c3120b04651c4bf852a694eb483b15b3ef58a88d1cef8227" exitCode=143 Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.419474 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3b427fc-538b-4823-8ef3-8bab1765faee","Type":"ContainerDied","Data":"cb88e62a1892c980c3120b04651c4bf852a694eb483b15b3ef58a88d1cef8227"} Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.423748 4837 generic.go:334] "Generic (PLEG): container finished" podID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerID="91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6" exitCode=143 Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.423818 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c72488b0-037a-4284-b428-e1907b3aa9ae","Type":"ContainerDied","Data":"91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6"} Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.433554 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-68b7b9db59-mdpgm" 
podUID="7ac8f443-1071-49d6-94d2-e7fea6f09cc5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.434984 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"148e6967-e15d-4c5c-89db-5a029e0ce45b","Type":"ContainerDied","Data":"6d525a0e74ce462b8965d81f151beb4df1d6fd0e26672c765206914758e4517a"} Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.435048 4837 scope.go:117] "RemoveContainer" containerID="ab87507ffe9280dbefb00787352ba7e2cba8685d5133ab1228bcf833c3f25df9" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.435063 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.446439 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.526093925 podStartE2EDuration="16.446419919s" podCreationTimestamp="2025-10-14 13:19:03 +0000 UTC" firstStartedPulling="2025-10-14 13:19:04.232399104 +0000 UTC m=+1082.149398937" lastFinishedPulling="2025-10-14 13:19:19.152725118 +0000 UTC m=+1097.069724931" observedRunningTime="2025-10-14 13:19:19.437437578 +0000 UTC m=+1097.354437391" watchObservedRunningTime="2025-10-14 13:19:19.446419919 +0000 UTC m=+1097.363419732" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.476872 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-68b7b9db59-mdpgm" podStartSLOduration=8.476847773 podStartE2EDuration="8.476847773s" podCreationTimestamp="2025-10-14 13:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:19.46330814 +0000 UTC m=+1097.380307963" watchObservedRunningTime="2025-10-14 13:19:19.476847773 +0000 UTC m=+1097.393847586" Oct 14 13:19:19 crc 
kubenswrapper[4837]: I1014 13:19:19.508490 4837 scope.go:117] "RemoveContainer" containerID="c58ba7655a29419ccb05230c455b8c766bbb730cb600598ff0094a57dffba076" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.514751 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.521740 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538328 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:19 crc kubenswrapper[4837]: E1014 13:19:19.538665 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="proxy-httpd" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538682 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="proxy-httpd" Oct 14 13:19:19 crc kubenswrapper[4837]: E1014 13:19:19.538697 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-notification-agent" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538705 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-notification-agent" Oct 14 13:19:19 crc kubenswrapper[4837]: E1014 13:19:19.538714 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-central-agent" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538720 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-central-agent" Oct 14 13:19:19 crc kubenswrapper[4837]: E1014 13:19:19.538756 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="sg-core" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538762 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="sg-core" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538911 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-central-agent" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538926 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="ceilometer-notification-agent" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538941 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="proxy-httpd" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.538954 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" containerName="sg-core" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.539241 4837 scope.go:117] "RemoveContainer" containerID="2593160a9c4c902c162deaa393413d006bee9d82c3b266a59a75d4e584cfa929" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.541468 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.549193 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.549352 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.556551 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.575523 4837 scope.go:117] "RemoveContainer" containerID="0bde1f81d042890b7a3b5b737d24072929d61912234fda517a5f54771adaba5f" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-run-httpd\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623120 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-scripts\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623181 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-log-httpd\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzsxv\" 
(UniqueName: \"kubernetes.io/projected/8329e863-be1f-4eee-a002-5a1ed17195ac-kube-api-access-zzsxv\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623281 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-config-data\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623302 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.623341 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-run-httpd\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725621 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-scripts\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " 
pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725675 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-log-httpd\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzsxv\" (UniqueName: \"kubernetes.io/projected/8329e863-be1f-4eee-a002-5a1ed17195ac-kube-api-access-zzsxv\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-config-data\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.725854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.726083 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-run-httpd\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.727348 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-log-httpd\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.748909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-scripts\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.749689 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzsxv\" (UniqueName: \"kubernetes.io/projected/8329e863-be1f-4eee-a002-5a1ed17195ac-kube-api-access-zzsxv\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.750965 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-config-data\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.751054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.756647 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.877418 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:19 crc kubenswrapper[4837]: I1014 13:19:19.921024 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:20 crc kubenswrapper[4837]: W1014 13:19:20.350230 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8329e863_be1f_4eee_a002_5a1ed17195ac.slice/crio-250d93f8134bea531e0e10ae641493c8c5fd0bba48be34804f6a9513f9152a48 WatchSource:0}: Error finding container 250d93f8134bea531e0e10ae641493c8c5fd0bba48be34804f6a9513f9152a48: Status 404 returned error can't find the container with id 250d93f8134bea531e0e10ae641493c8c5fd0bba48be34804f6a9513f9152a48 Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.351224 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.442917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerStarted","Data":"250d93f8134bea531e0e10ae641493c8c5fd0bba48be34804f6a9513f9152a48"} Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.456389 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.770829 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-trsrp"] Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.772254 4837 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.795119 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148e6967-e15d-4c5c-89db-5a029e0ce45b" path="/var/lib/kubelet/pods/148e6967-e15d-4c5c-89db-5a029e0ce45b/volumes" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.795853 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-trsrp"] Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.842930 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwj6\" (UniqueName: \"kubernetes.io/projected/cb637136-1a3b-4d9c-a991-619e80c8cf31-kube-api-access-2wwj6\") pod \"nova-api-db-create-trsrp\" (UID: \"cb637136-1a3b-4d9c-a991-619e80c8cf31\") " pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.869457 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jpbct"] Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.872566 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.880865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jpbct"] Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.944208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8xf\" (UniqueName: \"kubernetes.io/projected/b359e514-29e9-456b-814d-7e86c9f18e4c-kube-api-access-xj8xf\") pod \"nova-cell0-db-create-jpbct\" (UID: \"b359e514-29e9-456b-814d-7e86c9f18e4c\") " pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.944272 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwj6\" (UniqueName: \"kubernetes.io/projected/cb637136-1a3b-4d9c-a991-619e80c8cf31-kube-api-access-2wwj6\") pod \"nova-api-db-create-trsrp\" (UID: \"cb637136-1a3b-4d9c-a991-619e80c8cf31\") " pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:20 crc kubenswrapper[4837]: I1014 13:19:20.971901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwj6\" (UniqueName: \"kubernetes.io/projected/cb637136-1a3b-4d9c-a991-619e80c8cf31-kube-api-access-2wwj6\") pod \"nova-api-db-create-trsrp\" (UID: \"cb637136-1a3b-4d9c-a991-619e80c8cf31\") " pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.045840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8xf\" (UniqueName: \"kubernetes.io/projected/b359e514-29e9-456b-814d-7e86c9f18e4c-kube-api-access-xj8xf\") pod \"nova-cell0-db-create-jpbct\" (UID: \"b359e514-29e9-456b-814d-7e86c9f18e4c\") " pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.078673 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8xf\" 
(UniqueName: \"kubernetes.io/projected/b359e514-29e9-456b-814d-7e86c9f18e4c-kube-api-access-xj8xf\") pod \"nova-cell0-db-create-jpbct\" (UID: \"b359e514-29e9-456b-814d-7e86c9f18e4c\") " pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.084810 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mmjlq"] Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.086363 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.104088 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.104431 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mmjlq"] Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.193436 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.252564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5k8\" (UniqueName: \"kubernetes.io/projected/7f3ab4e0-d484-4d59-a19d-c6c3c197542f-kube-api-access-hb5k8\") pod \"nova-cell1-db-create-mmjlq\" (UID: \"7f3ab4e0-d484-4d59-a19d-c6c3c197542f\") " pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.355056 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5k8\" (UniqueName: \"kubernetes.io/projected/7f3ab4e0-d484-4d59-a19d-c6c3c197542f-kube-api-access-hb5k8\") pod \"nova-cell1-db-create-mmjlq\" (UID: \"7f3ab4e0-d484-4d59-a19d-c6c3c197542f\") " pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.416111 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5k8\" (UniqueName: \"kubernetes.io/projected/7f3ab4e0-d484-4d59-a19d-c6c3c197542f-kube-api-access-hb5k8\") pod \"nova-cell1-db-create-mmjlq\" (UID: \"7f3ab4e0-d484-4d59-a19d-c6c3c197542f\") " pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.469243 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerStarted","Data":"c4befa47898e817cd576ef141eee4e99c8139b17cefd80fe90c48fc5e1a4cfcf"} Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.509779 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.607866 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-trsrp"] Oct 14 13:19:21 crc kubenswrapper[4837]: W1014 13:19:21.651467 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb637136_1a3b_4d9c_a991_619e80c8cf31.slice/crio-48ea2475db06a515b4b062fe6f1d234ab7974589119fdad89590d5e8b65b5eb5 WatchSource:0}: Error finding container 48ea2475db06a515b4b062fe6f1d234ab7974589119fdad89590d5e8b65b5eb5: Status 404 returned error can't find the container with id 48ea2475db06a515b4b062fe6f1d234ab7974589119fdad89590d5e8b65b5eb5 Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.781870 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jpbct"] Oct 14 13:19:21 crc kubenswrapper[4837]: I1014 13:19:21.998623 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mmjlq"] Oct 14 13:19:22 crc kubenswrapper[4837]: W1014 13:19:22.068366 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3ab4e0_d484_4d59_a19d_c6c3c197542f.slice/crio-d6adbabaf07f7cb9826382f45934828f1646cfc131c2ef2a462b852877223ffd WatchSource:0}: Error finding container d6adbabaf07f7cb9826382f45934828f1646cfc131c2ef2a462b852877223ffd: Status 404 returned error can't find the container with id d6adbabaf07f7cb9826382f45934828f1646cfc131c2ef2a462b852877223ffd Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.129430 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-scripts\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281256 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-combined-ca-bundle\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281311 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvrhs\" (UniqueName: \"kubernetes.io/projected/c72488b0-037a-4284-b428-e1907b3aa9ae-kube-api-access-qvrhs\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281341 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-public-tls-certs\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281366 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-logs\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281401 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281501 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-config-data\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.281655 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-httpd-run\") pod \"c72488b0-037a-4284-b428-e1907b3aa9ae\" (UID: \"c72488b0-037a-4284-b428-e1907b3aa9ae\") " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.282316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.282519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-logs" (OuterVolumeSpecName: "logs") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.287197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). 
InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.288038 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-scripts" (OuterVolumeSpecName: "scripts") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.288684 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72488b0-037a-4284-b428-e1907b3aa9ae-kube-api-access-qvrhs" (OuterVolumeSpecName: "kube-api-access-qvrhs") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "kube-api-access-qvrhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.315388 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.347609 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.357356 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-config-data" (OuterVolumeSpecName: "config-data") pod "c72488b0-037a-4284-b428-e1907b3aa9ae" (UID: "c72488b0-037a-4284-b428-e1907b3aa9ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.383989 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384022 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384031 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384039 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384050 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvrhs\" (UniqueName: \"kubernetes.io/projected/c72488b0-037a-4284-b428-e1907b3aa9ae-kube-api-access-qvrhs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384058 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c72488b0-037a-4284-b428-e1907b3aa9ae-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384066 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c72488b0-037a-4284-b428-e1907b3aa9ae-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.384099 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.403352 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.485421 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.494339 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerStarted","Data":"da68eb6eb43ca1a895d56ce7df284447bbfa84020d2a5e6c6690446e84a95f1c"} Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.496751 4837 generic.go:334] "Generic (PLEG): container finished" podID="7f3ab4e0-d484-4d59-a19d-c6c3c197542f" containerID="8d8fb652e7baf9b186ba9013bb794f0a2da8988ba35ab26ebb63b8dce919f5d3" exitCode=0 Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.497291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mmjlq" event={"ID":"7f3ab4e0-d484-4d59-a19d-c6c3c197542f","Type":"ContainerDied","Data":"8d8fb652e7baf9b186ba9013bb794f0a2da8988ba35ab26ebb63b8dce919f5d3"} Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 
13:19:22.497320 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mmjlq" event={"ID":"7f3ab4e0-d484-4d59-a19d-c6c3c197542f","Type":"ContainerStarted","Data":"d6adbabaf07f7cb9826382f45934828f1646cfc131c2ef2a462b852877223ffd"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.498995 4837 generic.go:334] "Generic (PLEG): container finished" podID="cb637136-1a3b-4d9c-a991-619e80c8cf31" containerID="f2d7760bc7810e242f6f5676cb97d1341e34ac15e9d05d294bc7811f77a52ad2" exitCode=0
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.499050 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trsrp" event={"ID":"cb637136-1a3b-4d9c-a991-619e80c8cf31","Type":"ContainerDied","Data":"f2d7760bc7810e242f6f5676cb97d1341e34ac15e9d05d294bc7811f77a52ad2"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.499073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trsrp" event={"ID":"cb637136-1a3b-4d9c-a991-619e80c8cf31","Type":"ContainerStarted","Data":"48ea2475db06a515b4b062fe6f1d234ab7974589119fdad89590d5e8b65b5eb5"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.501511 4837 generic.go:334] "Generic (PLEG): container finished" podID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerID="7f25a0404cc944c68b93645f8ffbce8a82ecc4281bf1bcb1f9ddbaaba056010e" exitCode=0
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.501599 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3b427fc-538b-4823-8ef3-8bab1765faee","Type":"ContainerDied","Data":"7f25a0404cc944c68b93645f8ffbce8a82ecc4281bf1bcb1f9ddbaaba056010e"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.503700 4837 generic.go:334] "Generic (PLEG): container finished" podID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerID="459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278" exitCode=0
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.503762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c72488b0-037a-4284-b428-e1907b3aa9ae","Type":"ContainerDied","Data":"459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.503797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c72488b0-037a-4284-b428-e1907b3aa9ae","Type":"ContainerDied","Data":"528b724c93f160c0d15cf8173200a96563091d365af2d45cba29616e47f6b4b4"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.503818 4837 scope.go:117] "RemoveContainer" containerID="459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.503960 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.515511 4837 generic.go:334] "Generic (PLEG): container finished" podID="b359e514-29e9-456b-814d-7e86c9f18e4c" containerID="452e3e4995effccd4e1e14fb9ef73c5dfe426ecdbb529c37281ebb0b866b2f21" exitCode=0
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.515552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpbct" event={"ID":"b359e514-29e9-456b-814d-7e86c9f18e4c","Type":"ContainerDied","Data":"452e3e4995effccd4e1e14fb9ef73c5dfe426ecdbb529c37281ebb0b866b2f21"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.515579 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpbct" event={"ID":"b359e514-29e9-456b-814d-7e86c9f18e4c","Type":"ContainerStarted","Data":"4444c36d9b934428e6ce3475f405666348d879b47e23d48d5fced0e137fa24b0"}
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.561895 4837 scope.go:117] "RemoveContainer" containerID="91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.592226 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.597302 4837 scope.go:117] "RemoveContainer" containerID="459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278"
Oct 14 13:19:22 crc kubenswrapper[4837]: E1014 13:19:22.598015 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278\": container with ID starting with 459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278 not found: ID does not exist" containerID="459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.598053 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278"} err="failed to get container status \"459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278\": rpc error: code = NotFound desc = could not find container \"459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278\": container with ID starting with 459ba94f170b19d4fa7cedab0cc647f5ed43c075076d4d76eb82ad466a181278 not found: ID does not exist"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.598087 4837 scope.go:117] "RemoveContainer" containerID="91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6"
Oct 14 13:19:22 crc kubenswrapper[4837]: E1014 13:19:22.598703 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6\": container with ID starting with 91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6 not found: ID does not exist" containerID="91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.598742 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6"} err="failed to get container status \"91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6\": rpc error: code = NotFound desc = could not find container \"91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6\": container with ID starting with 91d173467ad59bfa59b7cb26b6f3222c7c9c07b1257776204b932089162f1bf6 not found: ID does not exist"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.599889 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.625237 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 14 13:19:22 crc kubenswrapper[4837]: E1014 13:19:22.632172 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-log"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.632212 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-log"
Oct 14 13:19:22 crc kubenswrapper[4837]: E1014 13:19:22.632243 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-httpd"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.632249 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-httpd"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.632894 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-log"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.632917 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" containerName="glance-httpd"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.634530 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.638087 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.642631 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.648052 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796087 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6304802-caa4-4ed2-a570-fc09f7c940b5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6304802-caa4-4ed2-a570-fc09f7c940b5-logs\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796150 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796233 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bh2n\" (UniqueName: \"kubernetes.io/projected/a6304802-caa4-4ed2-a570-fc09f7c940b5-kube-api-access-5bh2n\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796331 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.796360 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.857391 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72488b0-037a-4284-b428-e1907b3aa9ae" path="/var/lib/kubelet/pods/c72488b0-037a-4284-b428-e1907b3aa9ae/volumes"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.897910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6304802-caa4-4ed2-a570-fc09f7c940b5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.897958 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6304802-caa4-4ed2-a570-fc09f7c940b5-logs\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.897992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.898044 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.898081 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bh2n\" (UniqueName: \"kubernetes.io/projected/a6304802-caa4-4ed2-a570-fc09f7c940b5-kube-api-access-5bh2n\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.898124 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.898147 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.898197 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.899792 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6304802-caa4-4ed2-a570-fc09f7c940b5-logs\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.899998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6304802-caa4-4ed2-a570-fc09f7c940b5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.901050 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.903430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.905235 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.905516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.918011 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6304802-caa4-4ed2-a570-fc09f7c940b5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.920976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bh2n\" (UniqueName: \"kubernetes.io/projected/a6304802-caa4-4ed2-a570-fc09f7c940b5-kube-api-access-5bh2n\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.958400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"a6304802-caa4-4ed2-a570-fc09f7c940b5\") " pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.972300 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 14 13:19:22 crc kubenswrapper[4837]: I1014 13:19:22.976007 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108137 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-internal-tls-certs\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108214 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-scripts\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108240 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-config-data\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-logs\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-combined-ca-bundle\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n455k\" (UniqueName: \"kubernetes.io/projected/e3b427fc-538b-4823-8ef3-8bab1765faee-kube-api-access-n455k\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.108499 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-httpd-run\") pod \"e3b427fc-538b-4823-8ef3-8bab1765faee\" (UID: \"e3b427fc-538b-4823-8ef3-8bab1765faee\") "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.109322 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-logs" (OuterVolumeSpecName: "logs") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.109983 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.119873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-scripts" (OuterVolumeSpecName: "scripts") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.120043 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b427fc-538b-4823-8ef3-8bab1765faee-kube-api-access-n455k" (OuterVolumeSpecName: "kube-api-access-n455k") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "kube-api-access-n455k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.125769 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.144445 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.205838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.208306 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-config-data" (OuterVolumeSpecName: "config-data") pod "e3b427fc-538b-4823-8ef3-8bab1765faee" (UID: "e3b427fc-538b-4823-8ef3-8bab1765faee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210595 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n455k\" (UniqueName: \"kubernetes.io/projected/e3b427fc-538b-4823-8ef3-8bab1765faee-kube-api-access-n455k\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210645 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210658 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210670 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210680 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-scripts\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210691 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-config-data\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210709 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b427fc-538b-4823-8ef3-8bab1765faee-logs\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.210720 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b427fc-538b-4823-8ef3-8bab1765faee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.238424 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.311758 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.526248 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3b427fc-538b-4823-8ef3-8bab1765faee","Type":"ContainerDied","Data":"5ad241804e54013bfd390e2f2d37fdcc9e46991b75a10f9affa360a0b8dd9f25"}
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.526605 4837 scope.go:117] "RemoveContainer" containerID="7f25a0404cc944c68b93645f8ffbce8a82ecc4281bf1bcb1f9ddbaaba056010e"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.526278 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.530138 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerStarted","Data":"6a064235e84fee9beec50e8dee671551d37af489714ca51d8bf03fb94cc28c6f"}
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.563631 4837 scope.go:117] "RemoveContainer" containerID="cb88e62a1892c980c3120b04651c4bf852a694eb483b15b3ef58a88d1cef8227"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.572662 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.591621 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.610215 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.619534 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 14 13:19:23 crc kubenswrapper[4837]: E1014 13:19:23.620038 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-log"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.620058 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-log"
Oct 14 13:19:23 crc kubenswrapper[4837]: E1014 13:19:23.620119 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-httpd"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.620130 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-httpd"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.620657 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-log"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.620712 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-httpd"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.624461 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.628655 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.628904 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.629030 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719686 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719785 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719820 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719856 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4994k\" (UniqueName: \"kubernetes.io/projected/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-kube-api-access-4994k\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719924 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.719984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.720029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821483 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821649 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4994k\" (UniqueName: \"kubernetes.io/projected/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-kube-api-access-4994k\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.821832 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.822051 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.822351 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.823612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.827905 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.828000 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.828713 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0"
Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.843567 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.844274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4994k\" (UniqueName: \"kubernetes.io/projected/d215f9ee-cdfe-47a1-8240-e74f9f81f97d-kube-api-access-4994k\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:19:23 crc kubenswrapper[4837]: I1014 13:19:23.863468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d215f9ee-cdfe-47a1-8240-e74f9f81f97d\") " pod="openstack/glance-default-internal-api-0" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.031877 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.045774 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.064141 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.126124 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwj6\" (UniqueName: \"kubernetes.io/projected/cb637136-1a3b-4d9c-a991-619e80c8cf31-kube-api-access-2wwj6\") pod \"cb637136-1a3b-4d9c-a991-619e80c8cf31\" (UID: \"cb637136-1a3b-4d9c-a991-619e80c8cf31\") " Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.126202 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj8xf\" (UniqueName: \"kubernetes.io/projected/b359e514-29e9-456b-814d-7e86c9f18e4c-kube-api-access-xj8xf\") pod \"b359e514-29e9-456b-814d-7e86c9f18e4c\" (UID: \"b359e514-29e9-456b-814d-7e86c9f18e4c\") " Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.131225 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb637136-1a3b-4d9c-a991-619e80c8cf31-kube-api-access-2wwj6" (OuterVolumeSpecName: "kube-api-access-2wwj6") pod "cb637136-1a3b-4d9c-a991-619e80c8cf31" (UID: "cb637136-1a3b-4d9c-a991-619e80c8cf31"). InnerVolumeSpecName "kube-api-access-2wwj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.131559 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b359e514-29e9-456b-814d-7e86c9f18e4c-kube-api-access-xj8xf" (OuterVolumeSpecName: "kube-api-access-xj8xf") pod "b359e514-29e9-456b-814d-7e86c9f18e4c" (UID: "b359e514-29e9-456b-814d-7e86c9f18e4c"). InnerVolumeSpecName "kube-api-access-xj8xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.143757 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.227532 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb5k8\" (UniqueName: \"kubernetes.io/projected/7f3ab4e0-d484-4d59-a19d-c6c3c197542f-kube-api-access-hb5k8\") pod \"7f3ab4e0-d484-4d59-a19d-c6c3c197542f\" (UID: \"7f3ab4e0-d484-4d59-a19d-c6c3c197542f\") " Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.227882 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwj6\" (UniqueName: \"kubernetes.io/projected/cb637136-1a3b-4d9c-a991-619e80c8cf31-kube-api-access-2wwj6\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.227893 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj8xf\" (UniqueName: \"kubernetes.io/projected/b359e514-29e9-456b-814d-7e86c9f18e4c-kube-api-access-xj8xf\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.233015 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3ab4e0-d484-4d59-a19d-c6c3c197542f-kube-api-access-hb5k8" (OuterVolumeSpecName: "kube-api-access-hb5k8") pod "7f3ab4e0-d484-4d59-a19d-c6c3c197542f" (UID: "7f3ab4e0-d484-4d59-a19d-c6c3c197542f"). InnerVolumeSpecName "kube-api-access-hb5k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.330704 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb5k8\" (UniqueName: \"kubernetes.io/projected/7f3ab4e0-d484-4d59-a19d-c6c3c197542f-kube-api-access-hb5k8\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.560466 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jpbct" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.563185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpbct" event={"ID":"b359e514-29e9-456b-814d-7e86c9f18e4c","Type":"ContainerDied","Data":"4444c36d9b934428e6ce3475f405666348d879b47e23d48d5fced0e137fa24b0"} Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.563235 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4444c36d9b934428e6ce3475f405666348d879b47e23d48d5fced0e137fa24b0" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.566402 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6304802-caa4-4ed2-a570-fc09f7c940b5","Type":"ContainerStarted","Data":"cf550ce90c85ca7c93412d65a1feede518587a2b54ea3b27b4ca71c93763261e"} Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.566449 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6304802-caa4-4ed2-a570-fc09f7c940b5","Type":"ContainerStarted","Data":"9e65b111bce2b77d78826095354a42c8e7092450c47d1bb27339344787a229aa"} Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.580013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mmjlq" event={"ID":"7f3ab4e0-d484-4d59-a19d-c6c3c197542f","Type":"ContainerDied","Data":"d6adbabaf07f7cb9826382f45934828f1646cfc131c2ef2a462b852877223ffd"} Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.580084 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6adbabaf07f7cb9826382f45934828f1646cfc131c2ef2a462b852877223ffd" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.584411 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mmjlq" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.597884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trsrp" event={"ID":"cb637136-1a3b-4d9c-a991-619e80c8cf31","Type":"ContainerDied","Data":"48ea2475db06a515b4b062fe6f1d234ab7974589119fdad89590d5e8b65b5eb5"} Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.598541 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48ea2475db06a515b4b062fe6f1d234ab7974589119fdad89590d5e8b65b5eb5" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.598222 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trsrp" Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.695027 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 14 13:19:24 crc kubenswrapper[4837]: I1014 13:19:24.798663 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" path="/var/lib/kubelet/pods/e3b427fc-538b-4823-8ef3-8bab1765faee/volumes" Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.609819 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d215f9ee-cdfe-47a1-8240-e74f9f81f97d","Type":"ContainerStarted","Data":"37bb0fbd975cc7dbb493ddf2b81bab2633a02ff21781ca9985856de79eb125e5"} Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.610474 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d215f9ee-cdfe-47a1-8240-e74f9f81f97d","Type":"ContainerStarted","Data":"d58cade4d880d830fb074cfdcc22fa155c6294ad49303b88dbc5c2defdb69011"} Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.612483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerStarted","Data":"ae22850200d2bdfeae131a3fc5f5d23362efead182fe2123e5b1872eff284a8c"} Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.612572 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-central-agent" containerID="cri-o://c4befa47898e817cd576ef141eee4e99c8139b17cefd80fe90c48fc5e1a4cfcf" gracePeriod=30 Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.612610 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.612624 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="sg-core" containerID="cri-o://6a064235e84fee9beec50e8dee671551d37af489714ca51d8bf03fb94cc28c6f" gracePeriod=30 Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.612642 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="proxy-httpd" containerID="cri-o://ae22850200d2bdfeae131a3fc5f5d23362efead182fe2123e5b1872eff284a8c" gracePeriod=30 Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.612681 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-notification-agent" containerID="cri-o://da68eb6eb43ca1a895d56ce7df284447bbfa84020d2a5e6c6690446e84a95f1c" gracePeriod=30 Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.618291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a6304802-caa4-4ed2-a570-fc09f7c940b5","Type":"ContainerStarted","Data":"e1dfdb8d9b367281889f60fbada6f485d459a2967f1c285053d5f50c499a5d57"} Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.664649 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.10324725 podStartE2EDuration="6.664630558s" podCreationTimestamp="2025-10-14 13:19:19 +0000 UTC" firstStartedPulling="2025-10-14 13:19:20.352569199 +0000 UTC m=+1098.269569012" lastFinishedPulling="2025-10-14 13:19:24.913952507 +0000 UTC m=+1102.830952320" observedRunningTime="2025-10-14 13:19:25.634385688 +0000 UTC m=+1103.551385501" watchObservedRunningTime="2025-10-14 13:19:25.664630558 +0000 UTC m=+1103.581630371" Oct 14 13:19:25 crc kubenswrapper[4837]: I1014 13:19:25.668342 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.668324276 podStartE2EDuration="3.668324276s" podCreationTimestamp="2025-10-14 13:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:25.666645311 +0000 UTC m=+1103.583645114" watchObservedRunningTime="2025-10-14 13:19:25.668324276 +0000 UTC m=+1103.585324089" Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.472690 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-68b7b9db59-mdpgm" Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.628389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d215f9ee-cdfe-47a1-8240-e74f9f81f97d","Type":"ContainerStarted","Data":"c7fcde6d38f5cc41e8127807dad923f1125e5f24e7a7578eed1237a54d6d8177"} Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.631359 4837 generic.go:334] "Generic (PLEG): container finished" podID="8329e863-be1f-4eee-a002-5a1ed17195ac" 
containerID="ae22850200d2bdfeae131a3fc5f5d23362efead182fe2123e5b1872eff284a8c" exitCode=0 Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.631386 4837 generic.go:334] "Generic (PLEG): container finished" podID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerID="6a064235e84fee9beec50e8dee671551d37af489714ca51d8bf03fb94cc28c6f" exitCode=2 Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.631394 4837 generic.go:334] "Generic (PLEG): container finished" podID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerID="da68eb6eb43ca1a895d56ce7df284447bbfa84020d2a5e6c6690446e84a95f1c" exitCode=0 Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.632253 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerDied","Data":"ae22850200d2bdfeae131a3fc5f5d23362efead182fe2123e5b1872eff284a8c"} Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.632287 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerDied","Data":"6a064235e84fee9beec50e8dee671551d37af489714ca51d8bf03fb94cc28c6f"} Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.632301 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerDied","Data":"da68eb6eb43ca1a895d56ce7df284447bbfa84020d2a5e6c6690446e84a95f1c"} Oct 14 13:19:26 crc kubenswrapper[4837]: I1014 13:19:26.654452 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.654426489 podStartE2EDuration="3.654426489s" podCreationTimestamp="2025-10-14 13:19:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:26.646151937 +0000 UTC m=+1104.563151770" 
watchObservedRunningTime="2025-10-14 13:19:26.654426489 +0000 UTC m=+1104.571426302" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.642715 4837 generic.go:334] "Generic (PLEG): container finished" podID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerID="c4befa47898e817cd576ef141eee4e99c8139b17cefd80fe90c48fc5e1a4cfcf" exitCode=0 Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.642818 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerDied","Data":"c4befa47898e817cd576ef141eee4e99c8139b17cefd80fe90c48fc5e1a4cfcf"} Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.760263 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.858964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-log-httpd\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859082 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzsxv\" (UniqueName: \"kubernetes.io/projected/8329e863-be1f-4eee-a002-5a1ed17195ac-kube-api-access-zzsxv\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859148 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-combined-ca-bundle\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859290 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-config-data\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859322 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-run-httpd\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859430 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-scripts\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-sg-core-conf-yaml\") pod \"8329e863-be1f-4eee-a002-5a1ed17195ac\" (UID: \"8329e863-be1f-4eee-a002-5a1ed17195ac\") " Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859610 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.859697 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.860091 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.860115 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8329e863-be1f-4eee-a002-5a1ed17195ac-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.864778 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8329e863-be1f-4eee-a002-5a1ed17195ac-kube-api-access-zzsxv" (OuterVolumeSpecName: "kube-api-access-zzsxv") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "kube-api-access-zzsxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.865930 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-scripts" (OuterVolumeSpecName: "scripts") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.894648 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.944385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.961689 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.961721 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.961733 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzsxv\" (UniqueName: \"kubernetes.io/projected/8329e863-be1f-4eee-a002-5a1ed17195ac-kube-api-access-zzsxv\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.961741 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 14 13:19:27 crc kubenswrapper[4837]: I1014 13:19:27.986336 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-config-data" (OuterVolumeSpecName: "config-data") pod "8329e863-be1f-4eee-a002-5a1ed17195ac" (UID: "8329e863-be1f-4eee-a002-5a1ed17195ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.064210 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8329e863-be1f-4eee-a002-5a1ed17195ac-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.653769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8329e863-be1f-4eee-a002-5a1ed17195ac","Type":"ContainerDied","Data":"250d93f8134bea531e0e10ae641493c8c5fd0bba48be34804f6a9513f9152a48"} Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.653822 4837 scope.go:117] "RemoveContainer" containerID="ae22850200d2bdfeae131a3fc5f5d23362efead182fe2123e5b1872eff284a8c" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.653846 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.707960 4837 scope.go:117] "RemoveContainer" containerID="6a064235e84fee9beec50e8dee671551d37af489714ca51d8bf03fb94cc28c6f" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.709929 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.722290 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.740811 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741133 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-notification-agent" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741148 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-notification-agent" Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741183 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="sg-core" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741189 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="sg-core" Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741205 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb637136-1a3b-4d9c-a991-619e80c8cf31" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741212 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb637136-1a3b-4d9c-a991-619e80c8cf31" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741225 4837 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b359e514-29e9-456b-814d-7e86c9f18e4c" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741232 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b359e514-29e9-456b-814d-7e86c9f18e4c" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741252 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="proxy-httpd" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741258 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="proxy-httpd" Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741268 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-central-agent" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741274 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-central-agent" Oct 14 13:19:28 crc kubenswrapper[4837]: E1014 13:19:28.741294 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3ab4e0-d484-4d59-a19d-c6c3c197542f" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741299 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3ab4e0-d484-4d59-a19d-c6c3c197542f" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741461 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="proxy-httpd" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741479 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b359e514-29e9-456b-814d-7e86c9f18e4c" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741489 
4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-central-agent" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741500 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="ceilometer-notification-agent" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741510 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3ab4e0-d484-4d59-a19d-c6c3c197542f" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741522 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb637136-1a3b-4d9c-a991-619e80c8cf31" containerName="mariadb-database-create" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.741530 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" containerName="sg-core" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.742999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.745082 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.745313 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.748603 4837 scope.go:117] "RemoveContainer" containerID="da68eb6eb43ca1a895d56ce7df284447bbfa84020d2a5e6c6690446e84a95f1c" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.758011 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.777391 4837 scope.go:117] "RemoveContainer" containerID="c4befa47898e817cd576ef141eee4e99c8139b17cefd80fe90c48fc5e1a4cfcf" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.801886 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8329e863-be1f-4eee-a002-5a1ed17195ac" path="/var/lib/kubelet/pods/8329e863-be1f-4eee-a002-5a1ed17195ac/volumes" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.884977 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t644\" (UniqueName: \"kubernetes.io/projected/c0d83934-712b-443a-8a0c-ddf235432984-kube-api-access-6t644\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.885048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.885086 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-scripts\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.885107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.885180 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-log-httpd\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.885207 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-config-data\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.885220 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-run-httpd\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.902840 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 
13:19:28.986502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-scripts\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.986831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.986927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-log-httpd\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.986959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-config-data\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.986974 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-run-httpd\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.987044 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t644\" (UniqueName: \"kubernetes.io/projected/c0d83934-712b-443a-8a0c-ddf235432984-kube-api-access-6t644\") pod \"ceilometer-0\" (UID: 
\"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.987089 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.988866 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-run-httpd\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.989561 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-log-httpd\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.991112 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:28 crc kubenswrapper[4837]: I1014 13:19:28.992746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-scripts\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.006504 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.006805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-config-data\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.009226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t644\" (UniqueName: \"kubernetes.io/projected/c0d83934-712b-443a-8a0c-ddf235432984-kube-api-access-6t644\") pod \"ceilometer-0\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " pod="openstack/ceilometer-0" Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.072433 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.530199 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:29 crc kubenswrapper[4837]: W1014 13:19:29.537203 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0d83934_712b_443a_8a0c_ddf235432984.slice/crio-47169b97e388fb623b15eda4928befeafb116ba267b2dd82fca4f7a60ee330ac WatchSource:0}: Error finding container 47169b97e388fb623b15eda4928befeafb116ba267b2dd82fca4f7a60ee330ac: Status 404 returned error can't find the container with id 47169b97e388fb623b15eda4928befeafb116ba267b2dd82fca4f7a60ee330ac Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.665577 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerStarted","Data":"47169b97e388fb623b15eda4928befeafb116ba267b2dd82fca4f7a60ee330ac"} Oct 14 13:19:29 crc kubenswrapper[4837]: I1014 13:19:29.801851 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:19:30 crc kubenswrapper[4837]: I1014 13:19:30.676739 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerStarted","Data":"ff38ed7e0105f9483a17e29758cf5aef2508f4363880a5520a8c6216a6a3b04b"} Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.025906 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-23f1-account-create-2lvq2"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.032960 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.036488 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.044837 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-23f1-account-create-2lvq2"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.123609 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrcj\" (UniqueName: \"kubernetes.io/projected/151a9907-f0f3-424b-ab8d-59072c705b8b-kube-api-access-ktrcj\") pod \"nova-api-23f1-account-create-2lvq2\" (UID: \"151a9907-f0f3-424b-ab8d-59072c705b8b\") " pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.152505 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f7c5db7df-7tsqg" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.213240 4837 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-db89896b8-g9kwv"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.213532 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-db89896b8-g9kwv" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-api" containerID="cri-o://bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a" gracePeriod=30 Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.213708 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-db89896b8-g9kwv" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-httpd" containerID="cri-o://dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37" gracePeriod=30 Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.227186 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrcj\" (UniqueName: \"kubernetes.io/projected/151a9907-f0f3-424b-ab8d-59072c705b8b-kube-api-access-ktrcj\") pod \"nova-api-23f1-account-create-2lvq2\" (UID: \"151a9907-f0f3-424b-ab8d-59072c705b8b\") " pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.238784 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-87c3-account-create-dc67f"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.239859 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.242377 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.251606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrcj\" (UniqueName: \"kubernetes.io/projected/151a9907-f0f3-424b-ab8d-59072c705b8b-kube-api-access-ktrcj\") pod \"nova-api-23f1-account-create-2lvq2\" (UID: \"151a9907-f0f3-424b-ab8d-59072c705b8b\") " pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.253588 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-87c3-account-create-dc67f"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.329699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk6w6\" (UniqueName: \"kubernetes.io/projected/d58df43b-87d6-4bf3-ae5f-1eba933a068e-kube-api-access-sk6w6\") pod \"nova-cell0-87c3-account-create-dc67f\" (UID: \"d58df43b-87d6-4bf3-ae5f-1eba933a068e\") " pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.379433 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.421716 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8b25-account-create-x7kg4"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.423440 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.434813 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.442373 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk6w6\" (UniqueName: \"kubernetes.io/projected/d58df43b-87d6-4bf3-ae5f-1eba933a068e-kube-api-access-sk6w6\") pod \"nova-cell0-87c3-account-create-dc67f\" (UID: \"d58df43b-87d6-4bf3-ae5f-1eba933a068e\") " pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.450724 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8b25-account-create-x7kg4"] Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.481375 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk6w6\" (UniqueName: \"kubernetes.io/projected/d58df43b-87d6-4bf3-ae5f-1eba933a068e-kube-api-access-sk6w6\") pod \"nova-cell0-87c3-account-create-dc67f\" (UID: \"d58df43b-87d6-4bf3-ae5f-1eba933a068e\") " pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.549811 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ww6\" (UniqueName: \"kubernetes.io/projected/e5c11a38-420a-42a5-b9f7-08785e1c1342-kube-api-access-44ww6\") pod \"nova-cell1-8b25-account-create-x7kg4\" (UID: \"e5c11a38-420a-42a5-b9f7-08785e1c1342\") " pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.652269 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44ww6\" (UniqueName: \"kubernetes.io/projected/e5c11a38-420a-42a5-b9f7-08785e1c1342-kube-api-access-44ww6\") pod \"nova-cell1-8b25-account-create-x7kg4\" 
(UID: \"e5c11a38-420a-42a5-b9f7-08785e1c1342\") " pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.660881 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.672130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ww6\" (UniqueName: \"kubernetes.io/projected/e5c11a38-420a-42a5-b9f7-08785e1c1342-kube-api-access-44ww6\") pod \"nova-cell1-8b25-account-create-x7kg4\" (UID: \"e5c11a38-420a-42a5-b9f7-08785e1c1342\") " pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.689699 4837 generic.go:334] "Generic (PLEG): container finished" podID="67361ebb-8351-4449-a97f-26f22d4263cc" containerID="dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37" exitCode=0 Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.689777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db89896b8-g9kwv" event={"ID":"67361ebb-8351-4449-a97f-26f22d4263cc","Type":"ContainerDied","Data":"dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37"} Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.698249 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerStarted","Data":"a5de6b4b42016b0b7b1facb19d8128ddf02550d5f0f8c1ec9f5d314e7ea68891"} Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.776613 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:31 crc kubenswrapper[4837]: I1014 13:19:31.902955 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-23f1-account-create-2lvq2"] Oct 14 13:19:31 crc kubenswrapper[4837]: W1014 13:19:31.910194 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151a9907_f0f3_424b_ab8d_59072c705b8b.slice/crio-d8bb8f91c02908da54da536da94c8d66a3365d7fe70d6987ad26002e96af9ec9 WatchSource:0}: Error finding container d8bb8f91c02908da54da536da94c8d66a3365d7fe70d6987ad26002e96af9ec9: Status 404 returned error can't find the container with id d8bb8f91c02908da54da536da94c8d66a3365d7fe70d6987ad26002e96af9ec9 Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.140594 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-87c3-account-create-dc67f"] Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.335856 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8b25-account-create-x7kg4"] Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.710655 4837 generic.go:334] "Generic (PLEG): container finished" podID="e5c11a38-420a-42a5-b9f7-08785e1c1342" containerID="424ae7873ca431a6b35de5724c85298f676e413955818457d6e8ce3dfdde5dec" exitCode=0 Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.710920 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b25-account-create-x7kg4" event={"ID":"e5c11a38-420a-42a5-b9f7-08785e1c1342","Type":"ContainerDied","Data":"424ae7873ca431a6b35de5724c85298f676e413955818457d6e8ce3dfdde5dec"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.711025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b25-account-create-x7kg4" 
event={"ID":"e5c11a38-420a-42a5-b9f7-08785e1c1342","Type":"ContainerStarted","Data":"77e14d0a3f0379b5df80d1f21c980da911b95813698b49dcd297f025f2cf90f9"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.713651 4837 generic.go:334] "Generic (PLEG): container finished" podID="151a9907-f0f3-424b-ab8d-59072c705b8b" containerID="3a4a1c4646135f719cc83377c2dd7ecb91cce3e4dd405c8c0e7399f6a8e1c0f0" exitCode=0 Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.713725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23f1-account-create-2lvq2" event={"ID":"151a9907-f0f3-424b-ab8d-59072c705b8b","Type":"ContainerDied","Data":"3a4a1c4646135f719cc83377c2dd7ecb91cce3e4dd405c8c0e7399f6a8e1c0f0"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.713763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23f1-account-create-2lvq2" event={"ID":"151a9907-f0f3-424b-ab8d-59072c705b8b","Type":"ContainerStarted","Data":"d8bb8f91c02908da54da536da94c8d66a3365d7fe70d6987ad26002e96af9ec9"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.718649 4837 generic.go:334] "Generic (PLEG): container finished" podID="d58df43b-87d6-4bf3-ae5f-1eba933a068e" containerID="4fe29d7164af95d4644a047c4ca450031224e03d56c6336aac62af6caba68c80" exitCode=0 Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.718735 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-87c3-account-create-dc67f" event={"ID":"d58df43b-87d6-4bf3-ae5f-1eba933a068e","Type":"ContainerDied","Data":"4fe29d7164af95d4644a047c4ca450031224e03d56c6336aac62af6caba68c80"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.718791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-87c3-account-create-dc67f" event={"ID":"d58df43b-87d6-4bf3-ae5f-1eba933a068e","Type":"ContainerStarted","Data":"fec84b33b9fc88da3ff5d8d7705994681e03ec19345d59586794dab448120e50"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 
13:19:32.721141 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerStarted","Data":"b0dd8f8594539ffcea85b5ac50119491801792af7c2815da3041e9ef00374898"} Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.973498 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 14 13:19:32 crc kubenswrapper[4837]: I1014 13:19:32.973552 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.015448 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.021845 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.625563 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.732819 4837 generic.go:334] "Generic (PLEG): container finished" podID="67361ebb-8351-4449-a97f-26f22d4263cc" containerID="bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a" exitCode=0 Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.732879 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db89896b8-g9kwv" event={"ID":"67361ebb-8351-4449-a97f-26f22d4263cc","Type":"ContainerDied","Data":"bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a"} Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.732911 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db89896b8-g9kwv" event={"ID":"67361ebb-8351-4449-a97f-26f22d4263cc","Type":"ContainerDied","Data":"3e1ce38c541b5c39fd5fa61cf923a29e24e5c1b8ff798d2fd076cbad1e470155"} Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.732917 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db89896b8-g9kwv" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.732927 4837 scope.go:117] "RemoveContainer" containerID="dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.738996 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-central-agent" containerID="cri-o://ff38ed7e0105f9483a17e29758cf5aef2508f4363880a5520a8c6216a6a3b04b" gracePeriod=30 Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.739707 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerStarted","Data":"96d43743cd76f95ace7be8b1398bb2193a3235bd1979284123f52765e5285c92"} Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.741913 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="proxy-httpd" containerID="cri-o://96d43743cd76f95ace7be8b1398bb2193a3235bd1979284123f52765e5285c92" gracePeriod=30 Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.742073 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="sg-core" containerID="cri-o://b0dd8f8594539ffcea85b5ac50119491801792af7c2815da3041e9ef00374898" gracePeriod=30 Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.742203 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-notification-agent" containerID="cri-o://a5de6b4b42016b0b7b1facb19d8128ddf02550d5f0f8c1ec9f5d314e7ea68891" gracePeriod=30 Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 
13:19:33.742578 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.742616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.742629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.759795 4837 scope.go:117] "RemoveContainer" containerID="bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.769808 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.86922537 podStartE2EDuration="5.769786167s" podCreationTimestamp="2025-10-14 13:19:28 +0000 UTC" firstStartedPulling="2025-10-14 13:19:29.539338843 +0000 UTC m=+1107.456338656" lastFinishedPulling="2025-10-14 13:19:33.43989965 +0000 UTC m=+1111.356899453" observedRunningTime="2025-10-14 13:19:33.767980998 +0000 UTC m=+1111.684980821" watchObservedRunningTime="2025-10-14 13:19:33.769786167 +0000 UTC m=+1111.686785980" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.792871 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-httpd-config\") pod \"67361ebb-8351-4449-a97f-26f22d4263cc\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.793087 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ktn\" (UniqueName: \"kubernetes.io/projected/67361ebb-8351-4449-a97f-26f22d4263cc-kube-api-access-b9ktn\") pod \"67361ebb-8351-4449-a97f-26f22d4263cc\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " Oct 14 13:19:33 crc 
kubenswrapper[4837]: I1014 13:19:33.793136 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-config\") pod \"67361ebb-8351-4449-a97f-26f22d4263cc\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.793229 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-ovndb-tls-certs\") pod \"67361ebb-8351-4449-a97f-26f22d4263cc\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.793252 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-combined-ca-bundle\") pod \"67361ebb-8351-4449-a97f-26f22d4263cc\" (UID: \"67361ebb-8351-4449-a97f-26f22d4263cc\") " Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.796096 4837 scope.go:117] "RemoveContainer" containerID="dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37" Oct 14 13:19:33 crc kubenswrapper[4837]: E1014 13:19:33.802189 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37\": container with ID starting with dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37 not found: ID does not exist" containerID="dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.802437 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37"} err="failed to get container status 
\"dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37\": rpc error: code = NotFound desc = could not find container \"dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37\": container with ID starting with dfe40270692c50aaa1d9dc6761e4ed2a5caccc8e0fd1052af0d5ff78ff904c37 not found: ID does not exist" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.802569 4837 scope.go:117] "RemoveContainer" containerID="bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.802833 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "67361ebb-8351-4449-a97f-26f22d4263cc" (UID: "67361ebb-8351-4449-a97f-26f22d4263cc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:33 crc kubenswrapper[4837]: E1014 13:19:33.803608 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a\": container with ID starting with bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a not found: ID does not exist" containerID="bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.803665 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a"} err="failed to get container status \"bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a\": rpc error: code = NotFound desc = could not find container \"bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a\": container with ID starting with bd617e612e5a93e1cf2ccaaf48341d23c0109784e60d4e382f41f4297e7e7d2a not found: ID does not exist" Oct 14 
13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.834907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67361ebb-8351-4449-a97f-26f22d4263cc-kube-api-access-b9ktn" (OuterVolumeSpecName: "kube-api-access-b9ktn") pod "67361ebb-8351-4449-a97f-26f22d4263cc" (UID: "67361ebb-8351-4449-a97f-26f22d4263cc"). InnerVolumeSpecName "kube-api-access-b9ktn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.866693 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67361ebb-8351-4449-a97f-26f22d4263cc" (UID: "67361ebb-8351-4449-a97f-26f22d4263cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.896133 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ktn\" (UniqueName: \"kubernetes.io/projected/67361ebb-8351-4449-a97f-26f22d4263cc-kube-api-access-b9ktn\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.896191 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.896203 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.927892 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-config" (OuterVolumeSpecName: "config") pod 
"67361ebb-8351-4449-a97f-26f22d4263cc" (UID: "67361ebb-8351-4449-a97f-26f22d4263cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.961229 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "67361ebb-8351-4449-a97f-26f22d4263cc" (UID: "67361ebb-8351-4449-a97f-26f22d4263cc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.997674 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:33 crc kubenswrapper[4837]: I1014 13:19:33.997737 4837 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67361ebb-8351-4449-a97f-26f22d4263cc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.047389 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.047429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.049962 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.065431 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db89896b8-g9kwv"] Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.070929 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db89896b8-g9kwv"] Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.092979 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.120218 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.204315 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktrcj\" (UniqueName: \"kubernetes.io/projected/151a9907-f0f3-424b-ab8d-59072c705b8b-kube-api-access-ktrcj\") pod \"151a9907-f0f3-424b-ab8d-59072c705b8b\" (UID: \"151a9907-f0f3-424b-ab8d-59072c705b8b\") " Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.210793 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151a9907-f0f3-424b-ab8d-59072c705b8b-kube-api-access-ktrcj" (OuterVolumeSpecName: "kube-api-access-ktrcj") pod "151a9907-f0f3-424b-ab8d-59072c705b8b" (UID: "151a9907-f0f3-424b-ab8d-59072c705b8b"). InnerVolumeSpecName "kube-api-access-ktrcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.305811 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktrcj\" (UniqueName: \"kubernetes.io/projected/151a9907-f0f3-424b-ab8d-59072c705b8b-kube-api-access-ktrcj\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.309558 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.317789 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.406754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk6w6\" (UniqueName: \"kubernetes.io/projected/d58df43b-87d6-4bf3-ae5f-1eba933a068e-kube-api-access-sk6w6\") pod \"d58df43b-87d6-4bf3-ae5f-1eba933a068e\" (UID: \"d58df43b-87d6-4bf3-ae5f-1eba933a068e\") " Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.406937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44ww6\" (UniqueName: \"kubernetes.io/projected/e5c11a38-420a-42a5-b9f7-08785e1c1342-kube-api-access-44ww6\") pod \"e5c11a38-420a-42a5-b9f7-08785e1c1342\" (UID: \"e5c11a38-420a-42a5-b9f7-08785e1c1342\") " Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.410890 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c11a38-420a-42a5-b9f7-08785e1c1342-kube-api-access-44ww6" (OuterVolumeSpecName: "kube-api-access-44ww6") pod "e5c11a38-420a-42a5-b9f7-08785e1c1342" (UID: "e5c11a38-420a-42a5-b9f7-08785e1c1342"). InnerVolumeSpecName "kube-api-access-44ww6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.412525 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58df43b-87d6-4bf3-ae5f-1eba933a068e-kube-api-access-sk6w6" (OuterVolumeSpecName: "kube-api-access-sk6w6") pod "d58df43b-87d6-4bf3-ae5f-1eba933a068e" (UID: "d58df43b-87d6-4bf3-ae5f-1eba933a068e"). InnerVolumeSpecName "kube-api-access-sk6w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.508915 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44ww6\" (UniqueName: \"kubernetes.io/projected/e5c11a38-420a-42a5-b9f7-08785e1c1342-kube-api-access-44ww6\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.508957 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk6w6\" (UniqueName: \"kubernetes.io/projected/d58df43b-87d6-4bf3-ae5f-1eba933a068e-kube-api-access-sk6w6\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.749618 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0d83934-712b-443a-8a0c-ddf235432984" containerID="b0dd8f8594539ffcea85b5ac50119491801792af7c2815da3041e9ef00374898" exitCode=2 Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.749652 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0d83934-712b-443a-8a0c-ddf235432984" containerID="a5de6b4b42016b0b7b1facb19d8128ddf02550d5f0f8c1ec9f5d314e7ea68891" exitCode=0 Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.749679 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerDied","Data":"b0dd8f8594539ffcea85b5ac50119491801792af7c2815da3041e9ef00374898"} Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.750170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerDied","Data":"a5de6b4b42016b0b7b1facb19d8128ddf02550d5f0f8c1ec9f5d314e7ea68891"} Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.751268 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b25-account-create-x7kg4" 
event={"ID":"e5c11a38-420a-42a5-b9f7-08785e1c1342","Type":"ContainerDied","Data":"77e14d0a3f0379b5df80d1f21c980da911b95813698b49dcd297f025f2cf90f9"} Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.751403 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77e14d0a3f0379b5df80d1f21c980da911b95813698b49dcd297f025f2cf90f9" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.751318 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8b25-account-create-x7kg4" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.753076 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23f1-account-create-2lvq2" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.753751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23f1-account-create-2lvq2" event={"ID":"151a9907-f0f3-424b-ab8d-59072c705b8b","Type":"ContainerDied","Data":"d8bb8f91c02908da54da536da94c8d66a3365d7fe70d6987ad26002e96af9ec9"} Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.753780 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8bb8f91c02908da54da536da94c8d66a3365d7fe70d6987ad26002e96af9ec9" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.755437 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-87c3-account-create-dc67f" event={"ID":"d58df43b-87d6-4bf3-ae5f-1eba933a068e","Type":"ContainerDied","Data":"fec84b33b9fc88da3ff5d8d7705994681e03ec19345d59586794dab448120e50"} Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.755467 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fec84b33b9fc88da3ff5d8d7705994681e03ec19345d59586794dab448120e50" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.755519 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-87c3-account-create-dc67f" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.760587 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.761191 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:34 crc kubenswrapper[4837]: I1014 13:19:34.801868 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" path="/var/lib/kubelet/pods/67361ebb-8351-4449-a97f-26f22d4263cc/volumes" Oct 14 13:19:35 crc kubenswrapper[4837]: I1014 13:19:35.770614 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:19:35 crc kubenswrapper[4837]: I1014 13:19:35.770915 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:19:35 crc kubenswrapper[4837]: I1014 13:19:35.801944 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 14 13:19:35 crc kubenswrapper[4837]: I1014 13:19:35.810633 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.489723 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zmvhn"] Oct 14 13:19:36 crc kubenswrapper[4837]: E1014 13:19:36.490075 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58df43b-87d6-4bf3-ae5f-1eba933a068e" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490091 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58df43b-87d6-4bf3-ae5f-1eba933a068e" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: E1014 13:19:36.490100 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="151a9907-f0f3-424b-ab8d-59072c705b8b" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490107 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="151a9907-f0f3-424b-ab8d-59072c705b8b" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: E1014 13:19:36.490124 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-api" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490131 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-api" Oct 14 13:19:36 crc kubenswrapper[4837]: E1014 13:19:36.490182 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-httpd" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490189 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-httpd" Oct 14 13:19:36 crc kubenswrapper[4837]: E1014 13:19:36.490200 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c11a38-420a-42a5-b9f7-08785e1c1342" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490207 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c11a38-420a-42a5-b9f7-08785e1c1342" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490374 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-httpd" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490389 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="67361ebb-8351-4449-a97f-26f22d4263cc" containerName="neutron-api" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490402 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c11a38-420a-42a5-b9f7-08785e1c1342" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490415 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58df43b-87d6-4bf3-ae5f-1eba933a068e" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490426 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="151a9907-f0f3-424b-ab8d-59072c705b8b" containerName="mariadb-account-create" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.490963 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.492790 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.494662 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mmjtp" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.494845 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.500112 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zmvhn"] Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.547747 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.547819 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-config-data\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.547863 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-scripts\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.547935 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wdpm\" (UniqueName: \"kubernetes.io/projected/0a699c63-862d-41a9-9dbd-5d81978c7985-kube-api-access-7wdpm\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.649291 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wdpm\" (UniqueName: \"kubernetes.io/projected/0a699c63-862d-41a9-9dbd-5d81978c7985-kube-api-access-7wdpm\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.649592 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc 
kubenswrapper[4837]: I1014 13:19:36.649757 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-config-data\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.649913 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-scripts\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.655434 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.655624 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-config-data\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.655862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-scripts\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.667571 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wdpm\" (UniqueName: \"kubernetes.io/projected/0a699c63-862d-41a9-9dbd-5d81978c7985-kube-api-access-7wdpm\") pod \"nova-cell0-conductor-db-sync-zmvhn\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.780258 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.780540 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.781944 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.794688 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 14 13:19:36 crc kubenswrapper[4837]: I1014 13:19:36.818441 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:37 crc kubenswrapper[4837]: I1014 13:19:37.371384 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zmvhn"] Oct 14 13:19:37 crc kubenswrapper[4837]: I1014 13:19:37.822485 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" event={"ID":"0a699c63-862d-41a9-9dbd-5d81978c7985","Type":"ContainerStarted","Data":"93f0f4a38b3465fdbe41a09cf8e55eb6f5a8dc25b5cd5b50cb380b6c66c576a0"} Oct 14 13:19:38 crc kubenswrapper[4837]: I1014 13:19:38.837368 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0d83934-712b-443a-8a0c-ddf235432984" containerID="ff38ed7e0105f9483a17e29758cf5aef2508f4363880a5520a8c6216a6a3b04b" exitCode=0 Oct 14 13:19:38 crc kubenswrapper[4837]: I1014 13:19:38.837414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerDied","Data":"ff38ed7e0105f9483a17e29758cf5aef2508f4363880a5520a8c6216a6a3b04b"} Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.139670 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.140043 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.140093 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.140924 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.141016 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d" gracePeriod=600 Oct 14 13:19:41 crc kubenswrapper[4837]: E1014 13:19:41.286438 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ba7fa6_d0a5_4e80_a6e4_33d7ce2081d3.slice/crio-8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d.scope\": RecentStats: unable to find data in memory cache]" Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.884812 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d" exitCode=0 Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.884859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d"} Oct 14 13:19:41 crc kubenswrapper[4837]: I1014 13:19:41.884890 4837 scope.go:117] 
"RemoveContainer" containerID="2f7061072f040d06169aa6c27c24b779e700a974c29cdb9d45439f3b10ea132d" Oct 14 13:19:44 crc kubenswrapper[4837]: I1014 13:19:44.912485 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" event={"ID":"0a699c63-862d-41a9-9dbd-5d81978c7985","Type":"ContainerStarted","Data":"cd138eec01a6fc44ee03eecdf3f58ae0230093cfa3a792114672df5342df5a10"} Oct 14 13:19:44 crc kubenswrapper[4837]: I1014 13:19:44.914606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"c044b8ea9bc069679094c7a3872ef16c9931631e466b1c5d874f80fa606522e9"} Oct 14 13:19:44 crc kubenswrapper[4837]: I1014 13:19:44.947175 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" podStartSLOduration=1.808092434 podStartE2EDuration="8.947135226s" podCreationTimestamp="2025-10-14 13:19:36 +0000 UTC" firstStartedPulling="2025-10-14 13:19:37.359522889 +0000 UTC m=+1115.276522702" lastFinishedPulling="2025-10-14 13:19:44.498565681 +0000 UTC m=+1122.415565494" observedRunningTime="2025-10-14 13:19:44.929655668 +0000 UTC m=+1122.846655481" watchObservedRunningTime="2025-10-14 13:19:44.947135226 +0000 UTC m=+1122.864135029" Oct 14 13:19:52 crc kubenswrapper[4837]: I1014 13:19:52.719897 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 14 13:19:52 crc kubenswrapper[4837]: I1014 13:19:52.721065 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" 
podUID="e3b427fc-538b-4823-8ef3-8bab1765faee" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 14 13:19:57 crc kubenswrapper[4837]: I1014 13:19:57.031504 4837 generic.go:334] "Generic (PLEG): container finished" podID="0a699c63-862d-41a9-9dbd-5d81978c7985" containerID="cd138eec01a6fc44ee03eecdf3f58ae0230093cfa3a792114672df5342df5a10" exitCode=0 Oct 14 13:19:57 crc kubenswrapper[4837]: I1014 13:19:57.031595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" event={"ID":"0a699c63-862d-41a9-9dbd-5d81978c7985","Type":"ContainerDied","Data":"cd138eec01a6fc44ee03eecdf3f58ae0230093cfa3a792114672df5342df5a10"} Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.388103 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.438652 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wdpm\" (UniqueName: \"kubernetes.io/projected/0a699c63-862d-41a9-9dbd-5d81978c7985-kube-api-access-7wdpm\") pod \"0a699c63-862d-41a9-9dbd-5d81978c7985\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.438747 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-scripts\") pod \"0a699c63-862d-41a9-9dbd-5d81978c7985\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.438905 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-combined-ca-bundle\") pod 
\"0a699c63-862d-41a9-9dbd-5d81978c7985\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.438953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-config-data\") pod \"0a699c63-862d-41a9-9dbd-5d81978c7985\" (UID: \"0a699c63-862d-41a9-9dbd-5d81978c7985\") " Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.444607 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-scripts" (OuterVolumeSpecName: "scripts") pod "0a699c63-862d-41a9-9dbd-5d81978c7985" (UID: "0a699c63-862d-41a9-9dbd-5d81978c7985"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.447489 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a699c63-862d-41a9-9dbd-5d81978c7985-kube-api-access-7wdpm" (OuterVolumeSpecName: "kube-api-access-7wdpm") pod "0a699c63-862d-41a9-9dbd-5d81978c7985" (UID: "0a699c63-862d-41a9-9dbd-5d81978c7985"). InnerVolumeSpecName "kube-api-access-7wdpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.478533 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a699c63-862d-41a9-9dbd-5d81978c7985" (UID: "0a699c63-862d-41a9-9dbd-5d81978c7985"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.480422 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-config-data" (OuterVolumeSpecName: "config-data") pod "0a699c63-862d-41a9-9dbd-5d81978c7985" (UID: "0a699c63-862d-41a9-9dbd-5d81978c7985"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.541301 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.541343 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.541357 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a699c63-862d-41a9-9dbd-5d81978c7985-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:58 crc kubenswrapper[4837]: I1014 13:19:58.541368 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wdpm\" (UniqueName: \"kubernetes.io/projected/0a699c63-862d-41a9-9dbd-5d81978c7985-kube-api-access-7wdpm\") on node \"crc\" DevicePath \"\"" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.050755 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" event={"ID":"0a699c63-862d-41a9-9dbd-5d81978c7985","Type":"ContainerDied","Data":"93f0f4a38b3465fdbe41a09cf8e55eb6f5a8dc25b5cd5b50cb380b6c66c576a0"} Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.050800 4837 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="93f0f4a38b3465fdbe41a09cf8e55eb6f5a8dc25b5cd5b50cb380b6c66c576a0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.050881 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zmvhn" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.076512 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.162253 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 14 13:19:59 crc kubenswrapper[4837]: E1014 13:19:59.162742 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a699c63-862d-41a9-9dbd-5d81978c7985" containerName="nova-cell0-conductor-db-sync" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.162764 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a699c63-862d-41a9-9dbd-5d81978c7985" containerName="nova-cell0-conductor-db-sync" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.163004 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a699c63-862d-41a9-9dbd-5d81978c7985" containerName="nova-cell0-conductor-db-sync" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.163755 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.165555 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mmjtp" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.167522 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.173875 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.253557 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd40b584-fe33-48fa-a09b-e50f7b40f785-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.253685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd40b584-fe33-48fa-a09b-e50f7b40f785-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.253929 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9vg\" (UniqueName: \"kubernetes.io/projected/bd40b584-fe33-48fa-a09b-e50f7b40f785-kube-api-access-vk9vg\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.356254 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd40b584-fe33-48fa-a09b-e50f7b40f785-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.356369 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd40b584-fe33-48fa-a09b-e50f7b40f785-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.356462 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk9vg\" (UniqueName: \"kubernetes.io/projected/bd40b584-fe33-48fa-a09b-e50f7b40f785-kube-api-access-vk9vg\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.360901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd40b584-fe33-48fa-a09b-e50f7b40f785-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.364884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd40b584-fe33-48fa-a09b-e50f7b40f785-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.373298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9vg\" (UniqueName: \"kubernetes.io/projected/bd40b584-fe33-48fa-a09b-e50f7b40f785-kube-api-access-vk9vg\") pod \"nova-cell0-conductor-0\" 
(UID: \"bd40b584-fe33-48fa-a09b-e50f7b40f785\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.480485 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 14 13:19:59 crc kubenswrapper[4837]: I1014 13:19:59.921536 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 14 13:20:00 crc kubenswrapper[4837]: I1014 13:20:00.061101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bd40b584-fe33-48fa-a09b-e50f7b40f785","Type":"ContainerStarted","Data":"d2a9d2bae07209e2d7a46f18a4f8e23dd0a01928bb0218bbc87b7a845ed49fe0"} Oct 14 13:20:01 crc kubenswrapper[4837]: I1014 13:20:01.070309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bd40b584-fe33-48fa-a09b-e50f7b40f785","Type":"ContainerStarted","Data":"3ce9597ad424a4b6a6d09cfbbfbc266ce2eb3e40a5810a792712b74d799c6b0b"} Oct 14 13:20:01 crc kubenswrapper[4837]: I1014 13:20:01.071702 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 14 13:20:01 crc kubenswrapper[4837]: I1014 13:20:01.089628 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.089608678 podStartE2EDuration="2.089608678s" podCreationTimestamp="2025-10-14 13:19:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:01.085829637 +0000 UTC m=+1139.002829470" watchObservedRunningTime="2025-10-14 13:20:01.089608678 +0000 UTC m=+1139.006608501" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.108074 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0d83934-712b-443a-8a0c-ddf235432984" 
containerID="96d43743cd76f95ace7be8b1398bb2193a3235bd1979284123f52765e5285c92" exitCode=137 Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.108179 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerDied","Data":"96d43743cd76f95ace7be8b1398bb2193a3235bd1979284123f52765e5285c92"} Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.219946 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.350322 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t644\" (UniqueName: \"kubernetes.io/projected/c0d83934-712b-443a-8a0c-ddf235432984-kube-api-access-6t644\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.350380 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-run-httpd\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.350432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-sg-core-conf-yaml\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.350447 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-log-httpd\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc 
kubenswrapper[4837]: I1014 13:20:04.350496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-scripts\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.350539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-combined-ca-bundle\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.350614 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-config-data\") pod \"c0d83934-712b-443a-8a0c-ddf235432984\" (UID: \"c0d83934-712b-443a-8a0c-ddf235432984\") " Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.351143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.351843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.356677 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d83934-712b-443a-8a0c-ddf235432984-kube-api-access-6t644" (OuterVolumeSpecName: "kube-api-access-6t644") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). InnerVolumeSpecName "kube-api-access-6t644". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.358491 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-scripts" (OuterVolumeSpecName: "scripts") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.384817 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.437881 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.452955 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.452991 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.453005 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.453057 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.453070 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t644\" (UniqueName: \"kubernetes.io/projected/c0d83934-712b-443a-8a0c-ddf235432984-kube-api-access-6t644\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.453083 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0d83934-712b-443a-8a0c-ddf235432984-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.477376 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-config-data" (OuterVolumeSpecName: "config-data") pod "c0d83934-712b-443a-8a0c-ddf235432984" (UID: "c0d83934-712b-443a-8a0c-ddf235432984"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:04 crc kubenswrapper[4837]: I1014 13:20:04.554488 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0d83934-712b-443a-8a0c-ddf235432984-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.121822 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0d83934-712b-443a-8a0c-ddf235432984","Type":"ContainerDied","Data":"47169b97e388fb623b15eda4928befeafb116ba267b2dd82fca4f7a60ee330ac"} Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.121917 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.122220 4837 scope.go:117] "RemoveContainer" containerID="96d43743cd76f95ace7be8b1398bb2193a3235bd1979284123f52765e5285c92" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.145032 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.146150 4837 scope.go:117] "RemoveContainer" containerID="b0dd8f8594539ffcea85b5ac50119491801792af7c2815da3041e9ef00374898" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.168369 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.173130 4837 scope.go:117] "RemoveContainer" containerID="a5de6b4b42016b0b7b1facb19d8128ddf02550d5f0f8c1ec9f5d314e7ea68891" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.176171 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:05 crc kubenswrapper[4837]: E1014 13:20:05.176560 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d83934-712b-443a-8a0c-ddf235432984" 
containerName="ceilometer-central-agent" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.176575 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-central-agent" Oct 14 13:20:05 crc kubenswrapper[4837]: E1014 13:20:05.176605 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="sg-core" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.176611 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="sg-core" Oct 14 13:20:05 crc kubenswrapper[4837]: E1014 13:20:05.176624 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="proxy-httpd" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.176630 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="proxy-httpd" Oct 14 13:20:05 crc kubenswrapper[4837]: E1014 13:20:05.176647 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-notification-agent" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.176653 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-notification-agent" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.177717 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-central-agent" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.177853 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="sg-core" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.177873 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="ceilometer-notification-agent" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.177886 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d83934-712b-443a-8a0c-ddf235432984" containerName="proxy-httpd" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.179574 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.186856 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.188777 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.188861 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.200565 4837 scope.go:117] "RemoveContainer" containerID="ff38ed7e0105f9483a17e29758cf5aef2508f4363880a5520a8c6216a6a3b04b" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267501 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-run-httpd\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267591 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267615 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-config-data\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267634 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-log-httpd\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267710 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-scripts\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267755 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.267779 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmr7\" (UniqueName: \"kubernetes.io/projected/cd31b711-5af6-4b4e-89eb-65085f9b88e6-kube-api-access-nxmr7\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.370393 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.370466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxmr7\" (UniqueName: \"kubernetes.io/projected/cd31b711-5af6-4b4e-89eb-65085f9b88e6-kube-api-access-nxmr7\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.370523 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-run-httpd\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.370651 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.370691 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-config-data\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.370729 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-log-httpd\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: 
I1014 13:20:05.370823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-scripts\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.372678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-run-httpd\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.372931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-log-httpd\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.375887 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-scripts\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.376370 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.377372 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-config-data\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " 
pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.377444 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.389397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmr7\" (UniqueName: \"kubernetes.io/projected/cd31b711-5af6-4b4e-89eb-65085f9b88e6-kube-api-access-nxmr7\") pod \"ceilometer-0\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.503843 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:05 crc kubenswrapper[4837]: I1014 13:20:05.969389 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:05 crc kubenswrapper[4837]: W1014 13:20:05.981331 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd31b711_5af6_4b4e_89eb_65085f9b88e6.slice/crio-5248fcbd5608431efa6376d7c89682a12e27eb28c50f3d532a1bd866fb827613 WatchSource:0}: Error finding container 5248fcbd5608431efa6376d7c89682a12e27eb28c50f3d532a1bd866fb827613: Status 404 returned error can't find the container with id 5248fcbd5608431efa6376d7c89682a12e27eb28c50f3d532a1bd866fb827613 Oct 14 13:20:06 crc kubenswrapper[4837]: I1014 13:20:06.131811 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerStarted","Data":"5248fcbd5608431efa6376d7c89682a12e27eb28c50f3d532a1bd866fb827613"} Oct 14 13:20:06 crc kubenswrapper[4837]: I1014 13:20:06.796328 4837 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="c0d83934-712b-443a-8a0c-ddf235432984" path="/var/lib/kubelet/pods/c0d83934-712b-443a-8a0c-ddf235432984/volumes" Oct 14 13:20:07 crc kubenswrapper[4837]: I1014 13:20:07.140206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerStarted","Data":"b2a459432b061fe738ecf8e213e4c4f7b47c965d4e77f61bf36ceed898c029ab"} Oct 14 13:20:08 crc kubenswrapper[4837]: I1014 13:20:08.148960 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerStarted","Data":"af9c55a9b7255ca55cd99331f2151ad7f9390a493ad77c04f04643b6c5d20198"} Oct 14 13:20:09 crc kubenswrapper[4837]: I1014 13:20:09.159544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerStarted","Data":"50ddee8ada23b5c7cf4616d1026eaf7f820614399d32a171018078b4a94d9a4c"} Oct 14 13:20:09 crc kubenswrapper[4837]: I1014 13:20:09.507366 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.013013 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-k4v9z"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.014633 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.019020 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.019289 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.027607 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k4v9z"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.066296 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-config-data\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.066355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.066400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-scripts\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.066464 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mnh\" (UniqueName: 
\"kubernetes.io/projected/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-kube-api-access-d4mnh\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.169958 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-config-data\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.170010 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.170054 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-scripts\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.170102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mnh\" (UniqueName: \"kubernetes.io/projected/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-kube-api-access-d4mnh\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.177221 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-config-data\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.181085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.200432 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-scripts\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.217227 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.219132 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.225444 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.236483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mnh\" (UniqueName: \"kubernetes.io/projected/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-kube-api-access-d4mnh\") pod \"nova-cell0-cell-mapping-k4v9z\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.250721 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.257715 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.262088 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.267558 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.272075 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.272115 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-logs\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 
13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.272195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-config-data\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.272253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vfp2\" (UniqueName: \"kubernetes.io/projected/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-kube-api-access-7vfp2\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.294203 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.355246 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.360609 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.362709 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.365324 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373495 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-logs\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373640 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-config-data\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373668 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-config-data\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " 
pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vfp2\" (UniqueName: \"kubernetes.io/projected/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-kube-api-access-7vfp2\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.373758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlqjz\" (UniqueName: \"kubernetes.io/projected/9f670486-ce95-470b-abdc-42ca55378cc2-kube-api-access-nlqjz\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.379849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-logs\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.383911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-config-data\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.394795 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.441426 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:10 crc 
kubenswrapper[4837]: I1014 13:20:10.484488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vfp2\" (UniqueName: \"kubernetes.io/projected/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-kube-api-access-7vfp2\") pod \"nova-api-0\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490217 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-config-data\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490338 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62496296-571d-43a1-ba03-30e59f710293-logs\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlqjz\" (UniqueName: \"kubernetes.io/projected/9f670486-ce95-470b-abdc-42ca55378cc2-kube-api-access-nlqjz\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490663 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-config-data\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.490750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5dfs\" (UniqueName: \"kubernetes.io/projected/62496296-571d-43a1-ba03-30e59f710293-kube-api-access-w5dfs\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.494690 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-config-data\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.509902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.560736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlqjz\" (UniqueName: \"kubernetes.io/projected/9f670486-ce95-470b-abdc-42ca55378cc2-kube-api-access-nlqjz\") pod \"nova-scheduler-0\" (UID: 
\"9f670486-ce95-470b-abdc-42ca55378cc2\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.574045 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.583308 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.593128 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.593186 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-config-data\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.593214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5dfs\" (UniqueName: \"kubernetes.io/projected/62496296-571d-43a1-ba03-30e59f710293-kube-api-access-w5dfs\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.593284 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62496296-571d-43a1-ba03-30e59f710293-logs\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.593722 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/62496296-571d-43a1-ba03-30e59f710293-logs\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.606493 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.612580 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.631177 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-config-data\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.645992 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.646584 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.651104 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.698062 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.698105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.698138 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mzq\" (UniqueName: \"kubernetes.io/projected/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-kube-api-access-n5mzq\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.715955 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5dfs\" (UniqueName: \"kubernetes.io/projected/62496296-571d-43a1-ba03-30e59f710293-kube-api-access-w5dfs\") pod \"nova-metadata-0\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.768447 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-np2rl"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.770605 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.798085 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.799203 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.799223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.799253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mzq\" (UniqueName: \"kubernetes.io/projected/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-kube-api-access-n5mzq\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.807857 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.815071 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.833456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mzq\" (UniqueName: \"kubernetes.io/projected/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-kube-api-access-n5mzq\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.895791 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-np2rl"] Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.900556 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.900646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4tk6\" (UniqueName: \"kubernetes.io/projected/5dd5cf86-df02-4983-86f2-430028f1d02d-kube-api-access-z4tk6\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.900792 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 
13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.900808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-config\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.900847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.900881 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:10 crc kubenswrapper[4837]: I1014 13:20:10.956601 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.002663 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.002709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-config\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.002753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.002789 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.002819 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 
13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.002855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4tk6\" (UniqueName: \"kubernetes.io/projected/5dd5cf86-df02-4983-86f2-430028f1d02d-kube-api-access-z4tk6\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.004108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.005252 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.005407 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-config\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.006013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.006032 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.021879 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4tk6\" (UniqueName: \"kubernetes.io/projected/5dd5cf86-df02-4983-86f2-430028f1d02d-kube-api-access-z4tk6\") pod \"dnsmasq-dns-865f5d856f-np2rl\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.160794 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.259636 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-k4v9z"] Oct 14 13:20:11 crc kubenswrapper[4837]: W1014 13:20:11.437206 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f670486_ce95_470b_abdc_42ca55378cc2.slice/crio-947e9670a6c1ceae1de2d84d754e445b68706d3eaa50aa7b52573081192dec69 WatchSource:0}: Error finding container 947e9670a6c1ceae1de2d84d754e445b68706d3eaa50aa7b52573081192dec69: Status 404 returned error can't find the container with id 947e9670a6c1ceae1de2d84d754e445b68706d3eaa50aa7b52573081192dec69 Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.440111 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.458539 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.549280 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 14 13:20:11 crc kubenswrapper[4837]: W1014 13:20:11.656921 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cb1b7f4_168e_48ec_a86f_58b2d40bdde3.slice/crio-87b7265f8e0a541f80847fe4eb0a638e33b791bf7d9e2d818c24ca87916429ba WatchSource:0}: Error finding container 87b7265f8e0a541f80847fe4eb0a638e33b791bf7d9e2d818c24ca87916429ba: Status 404 returned error can't find the container with id 87b7265f8e0a541f80847fe4eb0a638e33b791bf7d9e2d818c24ca87916429ba Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.657311 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.752975 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r9shz"] Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.754070 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.755668 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.757090 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.768011 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r9shz"] Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.826450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-scripts\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.826510 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-config-data\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.826553 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.826577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f8rxc\" (UniqueName: \"kubernetes.io/projected/bc381d25-0a5e-438b-b2c8-45edc80148f4-kube-api-access-f8rxc\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.854401 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-np2rl"] Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.928688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-scripts\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.928775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-config-data\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.928843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.928896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8rxc\" (UniqueName: \"kubernetes.io/projected/bc381d25-0a5e-438b-b2c8-45edc80148f4-kube-api-access-f8rxc\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " 
pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.934887 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.941546 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-config-data\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.941671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-scripts\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:11 crc kubenswrapper[4837]: I1014 13:20:11.957674 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8rxc\" (UniqueName: \"kubernetes.io/projected/bc381d25-0a5e-438b-b2c8-45edc80148f4-kube-api-access-f8rxc\") pod \"nova-cell1-conductor-db-sync-r9shz\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.076056 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.227352 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3","Type":"ContainerStarted","Data":"87b7265f8e0a541f80847fe4eb0a638e33b791bf7d9e2d818c24ca87916429ba"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.233494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k4v9z" event={"ID":"a6738cf5-e029-4393-9dcb-818a2e5ed0b3","Type":"ContainerStarted","Data":"443ff350c29b3303e59363224be1b5cd139d697736cb9756bc30bff5a1b8f62c"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.233541 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k4v9z" event={"ID":"a6738cf5-e029-4393-9dcb-818a2e5ed0b3","Type":"ContainerStarted","Data":"c1994f7536b6b2a0f4ab1bdd44cd0b6b38702fad0c30631fb713c52bccdc3bc2"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.237252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f670486-ce95-470b-abdc-42ca55378cc2","Type":"ContainerStarted","Data":"947e9670a6c1ceae1de2d84d754e445b68706d3eaa50aa7b52573081192dec69"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.242698 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731","Type":"ContainerStarted","Data":"08bc5b4f165879548a6ac4c7f6c7d47aceccf67a77cd877569d1c7b087754d15"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.274024 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-k4v9z" podStartSLOduration=3.274001876 podStartE2EDuration="3.274001876s" podCreationTimestamp="2025-10-14 13:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:12.267866241 +0000 UTC m=+1150.184866064" watchObservedRunningTime="2025-10-14 13:20:12.274001876 +0000 UTC m=+1150.191001679" Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.289853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerStarted","Data":"f6893e44a60a0237de489adef9ca8e9d742555369287b358ef14d7b245eba328"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.291531 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.312129 4837 generic.go:334] "Generic (PLEG): container finished" podID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerID="fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88" exitCode=0 Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.312290 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" event={"ID":"5dd5cf86-df02-4983-86f2-430028f1d02d","Type":"ContainerDied","Data":"fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.312338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" event={"ID":"5dd5cf86-df02-4983-86f2-430028f1d02d","Type":"ContainerStarted","Data":"e6023f3545fd4db942d55f8bfd1c1334f69ef458ce5ffc1d57b1af753871404f"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.330216 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62496296-571d-43a1-ba03-30e59f710293","Type":"ContainerStarted","Data":"3ada20ee7d7fb9faae89f0b386afadfc6dcb137759320a066244bd246f15a191"} Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.385709 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.432689419 podStartE2EDuration="7.385683717s" podCreationTimestamp="2025-10-14 13:20:05 +0000 UTC" firstStartedPulling="2025-10-14 13:20:05.983106632 +0000 UTC m=+1143.900106445" lastFinishedPulling="2025-10-14 13:20:10.93610093 +0000 UTC m=+1148.853100743" observedRunningTime="2025-10-14 13:20:12.315033525 +0000 UTC m=+1150.232033358" watchObservedRunningTime="2025-10-14 13:20:12.385683717 +0000 UTC m=+1150.302683520" Oct 14 13:20:12 crc kubenswrapper[4837]: I1014 13:20:12.638799 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r9shz"] Oct 14 13:20:12 crc kubenswrapper[4837]: W1014 13:20:12.717304 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc381d25_0a5e_438b_b2c8_45edc80148f4.slice/crio-bc14a8c90722fa0a3316d201087cfc0018f5daec3ab9cc69da9c8739ab18ea1f WatchSource:0}: Error finding container bc14a8c90722fa0a3316d201087cfc0018f5daec3ab9cc69da9c8739ab18ea1f: Status 404 returned error can't find the container with id bc14a8c90722fa0a3316d201087cfc0018f5daec3ab9cc69da9c8739ab18ea1f Oct 14 13:20:13 crc kubenswrapper[4837]: I1014 13:20:13.346476 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" event={"ID":"5dd5cf86-df02-4983-86f2-430028f1d02d","Type":"ContainerStarted","Data":"8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8"} Oct 14 13:20:13 crc kubenswrapper[4837]: I1014 13:20:13.346975 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:13 crc kubenswrapper[4837]: I1014 13:20:13.351677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r9shz" event={"ID":"bc381d25-0a5e-438b-b2c8-45edc80148f4","Type":"ContainerStarted","Data":"7ad0b1260b4f58255dd5eb02e380782519fc7ecb1c45ac008e0413662ceb9cf7"} 
Oct 14 13:20:13 crc kubenswrapper[4837]: I1014 13:20:13.351720 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r9shz" event={"ID":"bc381d25-0a5e-438b-b2c8-45edc80148f4","Type":"ContainerStarted","Data":"bc14a8c90722fa0a3316d201087cfc0018f5daec3ab9cc69da9c8739ab18ea1f"} Oct 14 13:20:13 crc kubenswrapper[4837]: I1014 13:20:13.376519 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" podStartSLOduration=3.376501626 podStartE2EDuration="3.376501626s" podCreationTimestamp="2025-10-14 13:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:13.367616939 +0000 UTC m=+1151.284616762" watchObservedRunningTime="2025-10-14 13:20:13.376501626 +0000 UTC m=+1151.293501439" Oct 14 13:20:13 crc kubenswrapper[4837]: I1014 13:20:13.398070 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-r9shz" podStartSLOduration=2.398049034 podStartE2EDuration="2.398049034s" podCreationTimestamp="2025-10-14 13:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:13.393468571 +0000 UTC m=+1151.310468384" watchObservedRunningTime="2025-10-14 13:20:13.398049034 +0000 UTC m=+1151.315048847" Oct 14 13:20:14 crc kubenswrapper[4837]: I1014 13:20:14.315495 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:14 crc kubenswrapper[4837]: I1014 13:20:14.326387 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.412025 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f" gracePeriod=30 Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.411982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3","Type":"ContainerStarted","Data":"e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f"} Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.416878 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f670486-ce95-470b-abdc-42ca55378cc2","Type":"ContainerStarted","Data":"62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40"} Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.420602 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731","Type":"ContainerStarted","Data":"c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb"} Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.420662 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731","Type":"ContainerStarted","Data":"278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16"} Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.424634 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62496296-571d-43a1-ba03-30e59f710293","Type":"ContainerStarted","Data":"70f41c47eed9c40e9856be2650fb7735fd88f63be1e1c9c94fef3d25b2e44ab1"} Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.424690 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"62496296-571d-43a1-ba03-30e59f710293","Type":"ContainerStarted","Data":"d4226ecec4a49d4a1f3189b44aa2b30eb4b02612f08b154c694d8fd65a28fb89"} Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.424810 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-metadata" containerID="cri-o://70f41c47eed9c40e9856be2650fb7735fd88f63be1e1c9c94fef3d25b2e44ab1" gracePeriod=30 Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.424821 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-log" containerID="cri-o://d4226ecec4a49d4a1f3189b44aa2b30eb4b02612f08b154c694d8fd65a28fb89" gracePeriod=30 Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.437424 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.624118754 podStartE2EDuration="7.43740915s" podCreationTimestamp="2025-10-14 13:20:10 +0000 UTC" firstStartedPulling="2025-10-14 13:20:11.683247862 +0000 UTC m=+1149.600247675" lastFinishedPulling="2025-10-14 13:20:16.496538258 +0000 UTC m=+1154.413538071" observedRunningTime="2025-10-14 13:20:17.432875318 +0000 UTC m=+1155.349875121" watchObservedRunningTime="2025-10-14 13:20:17.43740915 +0000 UTC m=+1155.354408963" Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.453332 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.395977083 podStartE2EDuration="7.453313805s" podCreationTimestamp="2025-10-14 13:20:10 +0000 UTC" firstStartedPulling="2025-10-14 13:20:11.444796456 +0000 UTC m=+1149.361796269" lastFinishedPulling="2025-10-14 13:20:16.502133178 +0000 UTC m=+1154.419132991" observedRunningTime="2025-10-14 13:20:17.450202212 +0000 UTC 
m=+1155.367202015" watchObservedRunningTime="2025-10-14 13:20:17.453313805 +0000 UTC m=+1155.370313618" Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.471790 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.425084462 podStartE2EDuration="7.47177414s" podCreationTimestamp="2025-10-14 13:20:10 +0000 UTC" firstStartedPulling="2025-10-14 13:20:11.449329476 +0000 UTC m=+1149.366329289" lastFinishedPulling="2025-10-14 13:20:16.496019154 +0000 UTC m=+1154.413018967" observedRunningTime="2025-10-14 13:20:17.469233122 +0000 UTC m=+1155.386232945" watchObservedRunningTime="2025-10-14 13:20:17.47177414 +0000 UTC m=+1155.388773953" Oct 14 13:20:17 crc kubenswrapper[4837]: I1014 13:20:17.490367 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.564926149 podStartE2EDuration="7.490350408s" podCreationTimestamp="2025-10-14 13:20:10 +0000 UTC" firstStartedPulling="2025-10-14 13:20:11.571471038 +0000 UTC m=+1149.488470851" lastFinishedPulling="2025-10-14 13:20:16.496895297 +0000 UTC m=+1154.413895110" observedRunningTime="2025-10-14 13:20:17.486484784 +0000 UTC m=+1155.403484607" watchObservedRunningTime="2025-10-14 13:20:17.490350408 +0000 UTC m=+1155.407350221" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.438763 4837 generic.go:334] "Generic (PLEG): container finished" podID="62496296-571d-43a1-ba03-30e59f710293" containerID="70f41c47eed9c40e9856be2650fb7735fd88f63be1e1c9c94fef3d25b2e44ab1" exitCode=0 Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.439043 4837 generic.go:334] "Generic (PLEG): container finished" podID="62496296-571d-43a1-ba03-30e59f710293" containerID="d4226ecec4a49d4a1f3189b44aa2b30eb4b02612f08b154c694d8fd65a28fb89" exitCode=143 Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.440112 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"62496296-571d-43a1-ba03-30e59f710293","Type":"ContainerDied","Data":"70f41c47eed9c40e9856be2650fb7735fd88f63be1e1c9c94fef3d25b2e44ab1"} Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.440171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62496296-571d-43a1-ba03-30e59f710293","Type":"ContainerDied","Data":"d4226ecec4a49d4a1f3189b44aa2b30eb4b02612f08b154c694d8fd65a28fb89"} Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.440186 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62496296-571d-43a1-ba03-30e59f710293","Type":"ContainerDied","Data":"3ada20ee7d7fb9faae89f0b386afadfc6dcb137759320a066244bd246f15a191"} Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.440198 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ada20ee7d7fb9faae89f0b386afadfc6dcb137759320a066244bd246f15a191" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.482180 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.585477 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62496296-571d-43a1-ba03-30e59f710293-logs\") pod \"62496296-571d-43a1-ba03-30e59f710293\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.585773 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-config-data\") pod \"62496296-571d-43a1-ba03-30e59f710293\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.585906 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5dfs\" (UniqueName: \"kubernetes.io/projected/62496296-571d-43a1-ba03-30e59f710293-kube-api-access-w5dfs\") pod \"62496296-571d-43a1-ba03-30e59f710293\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.585916 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62496296-571d-43a1-ba03-30e59f710293-logs" (OuterVolumeSpecName: "logs") pod "62496296-571d-43a1-ba03-30e59f710293" (UID: "62496296-571d-43a1-ba03-30e59f710293"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.586123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-combined-ca-bundle\") pod \"62496296-571d-43a1-ba03-30e59f710293\" (UID: \"62496296-571d-43a1-ba03-30e59f710293\") " Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.586951 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62496296-571d-43a1-ba03-30e59f710293-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.603315 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62496296-571d-43a1-ba03-30e59f710293-kube-api-access-w5dfs" (OuterVolumeSpecName: "kube-api-access-w5dfs") pod "62496296-571d-43a1-ba03-30e59f710293" (UID: "62496296-571d-43a1-ba03-30e59f710293"). InnerVolumeSpecName "kube-api-access-w5dfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.626947 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-config-data" (OuterVolumeSpecName: "config-data") pod "62496296-571d-43a1-ba03-30e59f710293" (UID: "62496296-571d-43a1-ba03-30e59f710293"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.630475 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62496296-571d-43a1-ba03-30e59f710293" (UID: "62496296-571d-43a1-ba03-30e59f710293"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.689461 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.689492 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5dfs\" (UniqueName: \"kubernetes.io/projected/62496296-571d-43a1-ba03-30e59f710293-kube-api-access-w5dfs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:18 crc kubenswrapper[4837]: I1014 13:20:18.689503 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62496296-571d-43a1-ba03-30e59f710293-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.446613 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.470560 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.478360 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.491832 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:19 crc kubenswrapper[4837]: E1014 13:20:19.492224 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-metadata" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.492240 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-metadata" Oct 14 13:20:19 crc kubenswrapper[4837]: E1014 13:20:19.492258 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-log" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.492266 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-log" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.492445 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-metadata" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.492464 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="62496296-571d-43a1-ba03-30e59f710293" containerName="nova-metadata-log" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.493480 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.498018 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.498270 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.505736 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.607279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-config-data\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.607707 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/158326fa-0460-4794-8158-5729dbefa9dd-logs\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.607897 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.608111 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.608204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm9hh\" (UniqueName: \"kubernetes.io/projected/158326fa-0460-4794-8158-5729dbefa9dd-kube-api-access-cm9hh\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.710350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.710468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm9hh\" (UniqueName: \"kubernetes.io/projected/158326fa-0460-4794-8158-5729dbefa9dd-kube-api-access-cm9hh\") pod 
\"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.710531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-config-data\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.710693 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158326fa-0460-4794-8158-5729dbefa9dd-logs\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.710795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.711303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158326fa-0460-4794-8158-5729dbefa9dd-logs\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.716302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.718228 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.718964 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-config-data\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.737840 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm9hh\" (UniqueName: \"kubernetes.io/projected/158326fa-0460-4794-8158-5729dbefa9dd-kube-api-access-cm9hh\") pod \"nova-metadata-0\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " pod="openstack/nova-metadata-0" Oct 14 13:20:19 crc kubenswrapper[4837]: I1014 13:20:19.821217 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.350086 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:20 crc kubenswrapper[4837]: W1014 13:20:20.350546 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod158326fa_0460_4794_8158_5729dbefa9dd.slice/crio-df3b5b5cee12b06142181caee8a12eb370a16533576f459ba1097974799aa9b0 WatchSource:0}: Error finding container df3b5b5cee12b06142181caee8a12eb370a16533576f459ba1097974799aa9b0: Status 404 returned error can't find the container with id df3b5b5cee12b06142181caee8a12eb370a16533576f459ba1097974799aa9b0 Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.466198 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158326fa-0460-4794-8158-5729dbefa9dd","Type":"ContainerStarted","Data":"df3b5b5cee12b06142181caee8a12eb370a16533576f459ba1097974799aa9b0"} Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.474665 4837 generic.go:334] "Generic (PLEG): container finished" podID="a6738cf5-e029-4393-9dcb-818a2e5ed0b3" containerID="443ff350c29b3303e59363224be1b5cd139d697736cb9756bc30bff5a1b8f62c" exitCode=0 Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.474720 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k4v9z" event={"ID":"a6738cf5-e029-4393-9dcb-818a2e5ed0b3","Type":"ContainerDied","Data":"443ff350c29b3303e59363224be1b5cd139d697736cb9756bc30bff5a1b8f62c"} Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.647111 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.647227 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 
13:20:20.651646 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.651745 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.679929 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.794879 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62496296-571d-43a1-ba03-30e59f710293" path="/var/lib/kubelet/pods/62496296-571d-43a1-ba03-30e59f710293/volumes" Oct 14 13:20:20 crc kubenswrapper[4837]: I1014 13:20:20.957628 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.162557 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.231005 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-w7l7p"] Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.231732 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" containerName="dnsmasq-dns" containerID="cri-o://8fc388ef14a51ca6ea0f41bd8aa332ab173480faacd4862f2d31d1bb865c048f" gracePeriod=10 Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.496106 4837 generic.go:334] "Generic (PLEG): container finished" podID="8773b825-0f73-4a74-9d59-522f75f7b425" containerID="8fc388ef14a51ca6ea0f41bd8aa332ab173480faacd4862f2d31d1bb865c048f" exitCode=0 Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.496187 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" event={"ID":"8773b825-0f73-4a74-9d59-522f75f7b425","Type":"ContainerDied","Data":"8fc388ef14a51ca6ea0f41bd8aa332ab173480faacd4862f2d31d1bb865c048f"} Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.502949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158326fa-0460-4794-8158-5729dbefa9dd","Type":"ContainerStarted","Data":"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1"} Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.503120 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158326fa-0460-4794-8158-5729dbefa9dd","Type":"ContainerStarted","Data":"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0"} Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.535706 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5356900639999997 podStartE2EDuration="2.535690064s" podCreationTimestamp="2025-10-14 13:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:21.530034342 +0000 UTC m=+1159.447034165" watchObservedRunningTime="2025-10-14 13:20:21.535690064 +0000 UTC m=+1159.452689877" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.562185 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.729481 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.729509 4837 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.844251 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.868901 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-config\") pod \"8773b825-0f73-4a74-9d59-522f75f7b425\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.871340 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9prks\" (UniqueName: \"kubernetes.io/projected/8773b825-0f73-4a74-9d59-522f75f7b425-kube-api-access-9prks\") pod \"8773b825-0f73-4a74-9d59-522f75f7b425\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.871381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-svc\") pod \"8773b825-0f73-4a74-9d59-522f75f7b425\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.871417 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-nb\") pod \"8773b825-0f73-4a74-9d59-522f75f7b425\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.871517 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-swift-storage-0\") pod \"8773b825-0f73-4a74-9d59-522f75f7b425\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.871582 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-sb\") pod \"8773b825-0f73-4a74-9d59-522f75f7b425\" (UID: \"8773b825-0f73-4a74-9d59-522f75f7b425\") " Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.894617 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8773b825-0f73-4a74-9d59-522f75f7b425-kube-api-access-9prks" (OuterVolumeSpecName: "kube-api-access-9prks") pod "8773b825-0f73-4a74-9d59-522f75f7b425" (UID: "8773b825-0f73-4a74-9d59-522f75f7b425"). InnerVolumeSpecName "kube-api-access-9prks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.953954 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8773b825-0f73-4a74-9d59-522f75f7b425" (UID: "8773b825-0f73-4a74-9d59-522f75f7b425"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.954038 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8773b825-0f73-4a74-9d59-522f75f7b425" (UID: "8773b825-0f73-4a74-9d59-522f75f7b425"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.958655 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8773b825-0f73-4a74-9d59-522f75f7b425" (UID: "8773b825-0f73-4a74-9d59-522f75f7b425"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.974028 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9prks\" (UniqueName: \"kubernetes.io/projected/8773b825-0f73-4a74-9d59-522f75f7b425-kube-api-access-9prks\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.974360 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.974431 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:21 crc kubenswrapper[4837]: I1014 13:20:21.974485 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.013299 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-config" (OuterVolumeSpecName: "config") pod "8773b825-0f73-4a74-9d59-522f75f7b425" (UID: "8773b825-0f73-4a74-9d59-522f75f7b425"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.043986 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8773b825-0f73-4a74-9d59-522f75f7b425" (UID: "8773b825-0f73-4a74-9d59-522f75f7b425"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.059643 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.076446 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.076479 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773b825-0f73-4a74-9d59-522f75f7b425-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.184321 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-scripts\") pod \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.184369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-combined-ca-bundle\") pod \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.184390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-config-data\") pod \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.184523 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4mnh\" (UniqueName: \"kubernetes.io/projected/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-kube-api-access-d4mnh\") pod \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\" (UID: \"a6738cf5-e029-4393-9dcb-818a2e5ed0b3\") " Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.188805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-scripts" (OuterVolumeSpecName: "scripts") pod "a6738cf5-e029-4393-9dcb-818a2e5ed0b3" (UID: "a6738cf5-e029-4393-9dcb-818a2e5ed0b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.193365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-kube-api-access-d4mnh" (OuterVolumeSpecName: "kube-api-access-d4mnh") pod "a6738cf5-e029-4393-9dcb-818a2e5ed0b3" (UID: "a6738cf5-e029-4393-9dcb-818a2e5ed0b3"). InnerVolumeSpecName "kube-api-access-d4mnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.243262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6738cf5-e029-4393-9dcb-818a2e5ed0b3" (UID: "a6738cf5-e029-4393-9dcb-818a2e5ed0b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.243380 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-config-data" (OuterVolumeSpecName: "config-data") pod "a6738cf5-e029-4393-9dcb-818a2e5ed0b3" (UID: "a6738cf5-e029-4393-9dcb-818a2e5ed0b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.288096 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4mnh\" (UniqueName: \"kubernetes.io/projected/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-kube-api-access-d4mnh\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.288138 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.288152 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.288196 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6738cf5-e029-4393-9dcb-818a2e5ed0b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.524867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" event={"ID":"8773b825-0f73-4a74-9d59-522f75f7b425","Type":"ContainerDied","Data":"853f07c4de0e2fe42461c1f73a6184b51dc4c822ed8d84e5141ecace62269a78"} Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.525023 4837 scope.go:117] "RemoveContainer" 
containerID="8fc388ef14a51ca6ea0f41bd8aa332ab173480faacd4862f2d31d1bb865c048f" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.525405 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-w7l7p" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.530800 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-k4v9z" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.532751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-k4v9z" event={"ID":"a6738cf5-e029-4393-9dcb-818a2e5ed0b3","Type":"ContainerDied","Data":"c1994f7536b6b2a0f4ab1bdd44cd0b6b38702fad0c30631fb713c52bccdc3bc2"} Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.532795 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1994f7536b6b2a0f4ab1bdd44cd0b6b38702fad0c30631fb713c52bccdc3bc2" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.562609 4837 scope.go:117] "RemoveContainer" containerID="267f2011388eaed0d785477edc6c93326ac3f5800a6cb94f5dff31bfbda6b23e" Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.600025 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-w7l7p"] Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.604650 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-w7l7p"] Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.703423 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.704010 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-log" containerID="cri-o://278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16" gracePeriod=30 Oct 14 
13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.704108 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-api" containerID="cri-o://c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb" gracePeriod=30 Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.727821 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.767895 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:22 crc kubenswrapper[4837]: I1014 13:20:22.799712 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" path="/var/lib/kubelet/pods/8773b825-0f73-4a74-9d59-522f75f7b425/volumes" Oct 14 13:20:23 crc kubenswrapper[4837]: I1014 13:20:23.541599 4837 generic.go:334] "Generic (PLEG): container finished" podID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerID="278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16" exitCode=143 Oct 14 13:20:23 crc kubenswrapper[4837]: I1014 13:20:23.541704 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731","Type":"ContainerDied","Data":"278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16"} Oct 14 13:20:23 crc kubenswrapper[4837]: I1014 13:20:23.544270 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-log" containerID="cri-o://9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0" gracePeriod=30 Oct 14 13:20:23 crc kubenswrapper[4837]: I1014 13:20:23.544417 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-metadata" containerID="cri-o://480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1" gracePeriod=30 Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.101274 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.223185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-nova-metadata-tls-certs\") pod \"158326fa-0460-4794-8158-5729dbefa9dd\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.223256 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-config-data\") pod \"158326fa-0460-4794-8158-5729dbefa9dd\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.223361 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm9hh\" (UniqueName: \"kubernetes.io/projected/158326fa-0460-4794-8158-5729dbefa9dd-kube-api-access-cm9hh\") pod \"158326fa-0460-4794-8158-5729dbefa9dd\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.223431 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-combined-ca-bundle\") pod \"158326fa-0460-4794-8158-5729dbefa9dd\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.223491 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/158326fa-0460-4794-8158-5729dbefa9dd-logs\") pod \"158326fa-0460-4794-8158-5729dbefa9dd\" (UID: \"158326fa-0460-4794-8158-5729dbefa9dd\") " Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.224349 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158326fa-0460-4794-8158-5729dbefa9dd-logs" (OuterVolumeSpecName: "logs") pod "158326fa-0460-4794-8158-5729dbefa9dd" (UID: "158326fa-0460-4794-8158-5729dbefa9dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.229705 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158326fa-0460-4794-8158-5729dbefa9dd-kube-api-access-cm9hh" (OuterVolumeSpecName: "kube-api-access-cm9hh") pod "158326fa-0460-4794-8158-5729dbefa9dd" (UID: "158326fa-0460-4794-8158-5729dbefa9dd"). InnerVolumeSpecName "kube-api-access-cm9hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.260047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-config-data" (OuterVolumeSpecName: "config-data") pod "158326fa-0460-4794-8158-5729dbefa9dd" (UID: "158326fa-0460-4794-8158-5729dbefa9dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.269015 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "158326fa-0460-4794-8158-5729dbefa9dd" (UID: "158326fa-0460-4794-8158-5729dbefa9dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.306521 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "158326fa-0460-4794-8158-5729dbefa9dd" (UID: "158326fa-0460-4794-8158-5729dbefa9dd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.325819 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.325872 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158326fa-0460-4794-8158-5729dbefa9dd-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.325886 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.325900 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158326fa-0460-4794-8158-5729dbefa9dd-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.325913 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm9hh\" (UniqueName: \"kubernetes.io/projected/158326fa-0460-4794-8158-5729dbefa9dd-kube-api-access-cm9hh\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567742 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="158326fa-0460-4794-8158-5729dbefa9dd" containerID="480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1" exitCode=0 Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567779 4837 generic.go:334] "Generic (PLEG): container finished" podID="158326fa-0460-4794-8158-5729dbefa9dd" containerID="9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0" exitCode=143 Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158326fa-0460-4794-8158-5729dbefa9dd","Type":"ContainerDied","Data":"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1"} Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158326fa-0460-4794-8158-5729dbefa9dd","Type":"ContainerDied","Data":"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0"} Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158326fa-0460-4794-8158-5729dbefa9dd","Type":"ContainerDied","Data":"df3b5b5cee12b06142181caee8a12eb370a16533576f459ba1097974799aa9b0"} Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567898 4837 scope.go:117] "RemoveContainer" containerID="480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567961 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9f670486-ce95-470b-abdc-42ca55378cc2" containerName="nova-scheduler-scheduler" containerID="cri-o://62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" gracePeriod=30 Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.567896 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.603699 4837 scope.go:117] "RemoveContainer" containerID="9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.623461 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.630498 4837 scope.go:117] "RemoveContainer" containerID="480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1" Oct 14 13:20:24 crc kubenswrapper[4837]: E1014 13:20:24.631017 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1\": container with ID starting with 480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1 not found: ID does not exist" containerID="480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.631059 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1"} err="failed to get container status \"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1\": rpc error: code = NotFound desc = could not find container \"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1\": container with ID starting with 480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1 not found: ID does not exist" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.631094 4837 scope.go:117] "RemoveContainer" containerID="9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0" Oct 14 13:20:24 crc kubenswrapper[4837]: E1014 13:20:24.632353 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0\": container with ID starting with 9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0 not found: ID does not exist" containerID="9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.632389 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0"} err="failed to get container status \"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0\": rpc error: code = NotFound desc = could not find container \"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0\": container with ID starting with 9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0 not found: ID does not exist" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.632412 4837 scope.go:117] "RemoveContainer" containerID="480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.634665 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1"} err="failed to get container status \"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1\": rpc error: code = NotFound desc = could not find container \"480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1\": container with ID starting with 480d5a819261f6035b56725c583f5f0289dd54f6066aa7ca5cdf07ae1343c1a1 not found: ID does not exist" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.634709 4837 scope.go:117] "RemoveContainer" containerID="9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.634977 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0"} err="failed to get container status \"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0\": rpc error: code = NotFound desc = could not find container \"9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0\": container with ID starting with 9b4e8582a02944e4839f091523a536fbfb8c604f7f736a7ed9304e209e308cb0 not found: ID does not exist" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.642981 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.653708 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:24 crc kubenswrapper[4837]: E1014 13:20:24.654580 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-metadata" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.654615 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-metadata" Oct 14 13:20:24 crc kubenswrapper[4837]: E1014 13:20:24.654639 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" containerName="dnsmasq-dns" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.654650 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" containerName="dnsmasq-dns" Oct 14 13:20:24 crc kubenswrapper[4837]: E1014 13:20:24.654683 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-log" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.654694 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-log" Oct 14 13:20:24 crc 
kubenswrapper[4837]: E1014 13:20:24.654724 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" containerName="init" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.654734 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" containerName="init" Oct 14 13:20:24 crc kubenswrapper[4837]: E1014 13:20:24.654768 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6738cf5-e029-4393-9dcb-818a2e5ed0b3" containerName="nova-manage" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.654777 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6738cf5-e029-4393-9dcb-818a2e5ed0b3" containerName="nova-manage" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.655002 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-log" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.655032 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="158326fa-0460-4794-8158-5729dbefa9dd" containerName="nova-metadata-metadata" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.655045 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8773b825-0f73-4a74-9d59-522f75f7b425" containerName="dnsmasq-dns" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.655063 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6738cf5-e029-4393-9dcb-818a2e5ed0b3" containerName="nova-manage" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.656690 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.658991 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.659676 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.660871 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.735995 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jh9p\" (UniqueName: \"kubernetes.io/projected/90a4e22d-dd23-427e-adbe-06cc729b517d-kube-api-access-8jh9p\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.736179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-config-data\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.736273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.736451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4e22d-dd23-427e-adbe-06cc729b517d-logs\") pod \"nova-metadata-0\" (UID: 
\"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.736527 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.795362 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158326fa-0460-4794-8158-5729dbefa9dd" path="/var/lib/kubelet/pods/158326fa-0460-4794-8158-5729dbefa9dd/volumes" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.838136 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4e22d-dd23-427e-adbe-06cc729b517d-logs\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.838540 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.838633 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jh9p\" (UniqueName: \"kubernetes.io/projected/90a4e22d-dd23-427e-adbe-06cc729b517d-kube-api-access-8jh9p\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.838670 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-config-data\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.838712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.839527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4e22d-dd23-427e-adbe-06cc729b517d-logs\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.843922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.844325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.845529 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-config-data\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc 
kubenswrapper[4837]: I1014 13:20:24.857729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jh9p\" (UniqueName: \"kubernetes.io/projected/90a4e22d-dd23-427e-adbe-06cc729b517d-kube-api-access-8jh9p\") pod \"nova-metadata-0\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " pod="openstack/nova-metadata-0" Oct 14 13:20:24 crc kubenswrapper[4837]: I1014 13:20:24.988047 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:20:25 crc kubenswrapper[4837]: I1014 13:20:25.442114 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:20:25 crc kubenswrapper[4837]: W1014 13:20:25.442983 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90a4e22d_dd23_427e_adbe_06cc729b517d.slice/crio-84441172a1c58f2dcb3519e1a19e058956b101a0e6ebc7487b375c1e90159e24 WatchSource:0}: Error finding container 84441172a1c58f2dcb3519e1a19e058956b101a0e6ebc7487b375c1e90159e24: Status 404 returned error can't find the container with id 84441172a1c58f2dcb3519e1a19e058956b101a0e6ebc7487b375c1e90159e24 Oct 14 13:20:25 crc kubenswrapper[4837]: I1014 13:20:25.578661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90a4e22d-dd23-427e-adbe-06cc729b517d","Type":"ContainerStarted","Data":"84441172a1c58f2dcb3519e1a19e058956b101a0e6ebc7487b375c1e90159e24"} Oct 14 13:20:25 crc kubenswrapper[4837]: E1014 13:20:25.659819 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 14 13:20:25 crc kubenswrapper[4837]: E1014 13:20:25.662029 4837 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 14 13:20:25 crc kubenswrapper[4837]: E1014 13:20:25.666063 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 14 13:20:25 crc kubenswrapper[4837]: E1014 13:20:25.666128 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9f670486-ce95-470b-abdc-42ca55378cc2" containerName="nova-scheduler-scheduler" Oct 14 13:20:26 crc kubenswrapper[4837]: I1014 13:20:26.668057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90a4e22d-dd23-427e-adbe-06cc729b517d","Type":"ContainerStarted","Data":"51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e"} Oct 14 13:20:26 crc kubenswrapper[4837]: I1014 13:20:26.668122 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90a4e22d-dd23-427e-adbe-06cc729b517d","Type":"ContainerStarted","Data":"91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2"} Oct 14 13:20:26 crc kubenswrapper[4837]: I1014 13:20:26.691197 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.691178425 podStartE2EDuration="2.691178425s" podCreationTimestamp="2025-10-14 13:20:24 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:26.68875075 +0000 UTC m=+1164.605750563" watchObservedRunningTime="2025-10-14 13:20:26.691178425 +0000 UTC m=+1164.608178238" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.563283 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.678524 4837 generic.go:334] "Generic (PLEG): container finished" podID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerID="c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb" exitCode=0 Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.678556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731","Type":"ContainerDied","Data":"c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb"} Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.678881 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731","Type":"ContainerDied","Data":"08bc5b4f165879548a6ac4c7f6c7d47aceccf67a77cd877569d1c7b087754d15"} Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.678918 4837 scope.go:117] "RemoveContainer" containerID="c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.678572 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.685239 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vfp2\" (UniqueName: \"kubernetes.io/projected/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-kube-api-access-7vfp2\") pod \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.685273 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-logs\") pod \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.685307 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-config-data\") pod \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.685427 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-combined-ca-bundle\") pod \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\" (UID: \"3cd02d6a-0cb7-47ef-b8e2-c860d7acb731\") " Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.685806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-logs" (OuterVolumeSpecName: "logs") pod "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" (UID: "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.690494 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-kube-api-access-7vfp2" (OuterVolumeSpecName: "kube-api-access-7vfp2") pod "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" (UID: "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731"). InnerVolumeSpecName "kube-api-access-7vfp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.704485 4837 scope.go:117] "RemoveContainer" containerID="278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.714084 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-config-data" (OuterVolumeSpecName: "config-data") pod "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" (UID: "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.720597 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" (UID: "3cd02d6a-0cb7-47ef-b8e2-c860d7acb731"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.787574 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vfp2\" (UniqueName: \"kubernetes.io/projected/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-kube-api-access-7vfp2\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.787607 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.787621 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.787633 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.795941 4837 scope.go:117] "RemoveContainer" containerID="c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb" Oct 14 13:20:27 crc kubenswrapper[4837]: E1014 13:20:27.796782 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb\": container with ID starting with c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb not found: ID does not exist" containerID="c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.796813 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb"} 
err="failed to get container status \"c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb\": rpc error: code = NotFound desc = could not find container \"c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb\": container with ID starting with c5519cc8eddcdd013f055de1089e5fdb378fd4d6b44497dbff20bd20226352fb not found: ID does not exist" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.796833 4837 scope.go:117] "RemoveContainer" containerID="278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16" Oct 14 13:20:27 crc kubenswrapper[4837]: E1014 13:20:27.797138 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16\": container with ID starting with 278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16 not found: ID does not exist" containerID="278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16" Oct 14 13:20:27 crc kubenswrapper[4837]: I1014 13:20:27.797174 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16"} err="failed to get container status \"278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16\": rpc error: code = NotFound desc = could not find container \"278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16\": container with ID starting with 278b3dd271b1d542382ee9678a36e7e0072579a390c6393ad5dbe5fdc495fe16 not found: ID does not exist" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.056875 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.067467 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.076265 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:28 crc kubenswrapper[4837]: E1014 13:20:28.076815 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-log" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.076845 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-log" Oct 14 13:20:28 crc kubenswrapper[4837]: E1014 13:20:28.076860 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-api" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.076868 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-api" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.077120 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-log" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.077201 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" containerName="nova-api-api" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.078453 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.080516 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.086194 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.195185 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnl69\" (UniqueName: \"kubernetes.io/projected/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-kube-api-access-tnl69\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.195322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-logs\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.195599 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.195982 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-config-data\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.297418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tnl69\" (UniqueName: \"kubernetes.io/projected/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-kube-api-access-tnl69\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.297507 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-logs\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.297648 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.297773 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-config-data\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.298939 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-logs\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.302219 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-config-data\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.302784 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.315594 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnl69\" (UniqueName: \"kubernetes.io/projected/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-kube-api-access-tnl69\") pod \"nova-api-0\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.394622 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.690448 4837 generic.go:334] "Generic (PLEG): container finished" podID="bc381d25-0a5e-438b-b2c8-45edc80148f4" containerID="7ad0b1260b4f58255dd5eb02e380782519fc7ecb1c45ac008e0413662ceb9cf7" exitCode=0 Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.690502 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r9shz" event={"ID":"bc381d25-0a5e-438b-b2c8-45edc80148f4","Type":"ContainerDied","Data":"7ad0b1260b4f58255dd5eb02e380782519fc7ecb1c45ac008e0413662ceb9cf7"} Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.795712 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd02d6a-0cb7-47ef-b8e2-c860d7acb731" path="/var/lib/kubelet/pods/3cd02d6a-0cb7-47ef-b8e2-c860d7acb731/volumes" Oct 14 13:20:28 crc kubenswrapper[4837]: I1014 13:20:28.889429 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:28 crc kubenswrapper[4837]: W1014 13:20:28.897751 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad51b65_0353_4bd6_8bb9_b5cc7943e934.slice/crio-e9b41635a5e7dc6379f74f890cb7d6299f4ac25bf81c303ba43a52316bd09534 WatchSource:0}: Error finding container e9b41635a5e7dc6379f74f890cb7d6299f4ac25bf81c303ba43a52316bd09534: Status 404 returned error can't find the container with id e9b41635a5e7dc6379f74f890cb7d6299f4ac25bf81c303ba43a52316bd09534 Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.155476 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.221050 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlqjz\" (UniqueName: \"kubernetes.io/projected/9f670486-ce95-470b-abdc-42ca55378cc2-kube-api-access-nlqjz\") pod \"9f670486-ce95-470b-abdc-42ca55378cc2\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.221282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-config-data\") pod \"9f670486-ce95-470b-abdc-42ca55378cc2\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.221318 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-combined-ca-bundle\") pod \"9f670486-ce95-470b-abdc-42ca55378cc2\" (UID: \"9f670486-ce95-470b-abdc-42ca55378cc2\") " Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.225398 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f670486-ce95-470b-abdc-42ca55378cc2-kube-api-access-nlqjz" (OuterVolumeSpecName: "kube-api-access-nlqjz") pod "9f670486-ce95-470b-abdc-42ca55378cc2" (UID: 
"9f670486-ce95-470b-abdc-42ca55378cc2"). InnerVolumeSpecName "kube-api-access-nlqjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.248347 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f670486-ce95-470b-abdc-42ca55378cc2" (UID: "9f670486-ce95-470b-abdc-42ca55378cc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.252457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-config-data" (OuterVolumeSpecName: "config-data") pod "9f670486-ce95-470b-abdc-42ca55378cc2" (UID: "9f670486-ce95-470b-abdc-42ca55378cc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.323448 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlqjz\" (UniqueName: \"kubernetes.io/projected/9f670486-ce95-470b-abdc-42ca55378cc2-kube-api-access-nlqjz\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.323486 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.323496 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f670486-ce95-470b-abdc-42ca55378cc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.703040 4837 generic.go:334] "Generic (PLEG): container finished" podID="9f670486-ce95-470b-abdc-42ca55378cc2" 
containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" exitCode=0 Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.703109 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f670486-ce95-470b-abdc-42ca55378cc2","Type":"ContainerDied","Data":"62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40"} Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.703137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f670486-ce95-470b-abdc-42ca55378cc2","Type":"ContainerDied","Data":"947e9670a6c1ceae1de2d84d754e445b68706d3eaa50aa7b52573081192dec69"} Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.703169 4837 scope.go:117] "RemoveContainer" containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.703197 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.705068 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ad51b65-0353-4bd6-8bb9-b5cc7943e934","Type":"ContainerStarted","Data":"474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e"} Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.705104 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ad51b65-0353-4bd6-8bb9-b5cc7943e934","Type":"ContainerStarted","Data":"95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1"} Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.705114 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ad51b65-0353-4bd6-8bb9-b5cc7943e934","Type":"ContainerStarted","Data":"e9b41635a5e7dc6379f74f890cb7d6299f4ac25bf81c303ba43a52316bd09534"} Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 
13:20:29.739866 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.739846634 podStartE2EDuration="1.739846634s" podCreationTimestamp="2025-10-14 13:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:29.737993436 +0000 UTC m=+1167.654993249" watchObservedRunningTime="2025-10-14 13:20:29.739846634 +0000 UTC m=+1167.656846447" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.752911 4837 scope.go:117] "RemoveContainer" containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" Oct 14 13:20:29 crc kubenswrapper[4837]: E1014 13:20:29.753556 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40\": container with ID starting with 62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40 not found: ID does not exist" containerID="62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.753605 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40"} err="failed to get container status \"62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40\": rpc error: code = NotFound desc = could not find container \"62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40\": container with ID starting with 62da9211735aac5ff15f45e4585b1ee47f30c296196c6b3fdc843649aafbec40 not found: ID does not exist" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.759964 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.771457 4837 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.785193 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:29 crc kubenswrapper[4837]: E1014 13:20:29.785561 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f670486-ce95-470b-abdc-42ca55378cc2" containerName="nova-scheduler-scheduler" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.785574 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f670486-ce95-470b-abdc-42ca55378cc2" containerName="nova-scheduler-scheduler" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.785763 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f670486-ce95-470b-abdc-42ca55378cc2" containerName="nova-scheduler-scheduler" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.786316 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.792214 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.804464 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.931356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6cfl\" (UniqueName: \"kubernetes.io/projected/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-kube-api-access-j6cfl\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.931468 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-config-data\") pod 
\"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.931492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.988434 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:20:29 crc kubenswrapper[4837]: I1014 13:20:29.988549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.033592 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6cfl\" (UniqueName: \"kubernetes.io/projected/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-kube-api-access-j6cfl\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.033663 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-config-data\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.033692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.039666 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.040522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-config-data\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.049974 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6cfl\" (UniqueName: \"kubernetes.io/projected/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-kube-api-access-j6cfl\") pod \"nova-scheduler-0\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.112883 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.213936 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.339710 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8rxc\" (UniqueName: \"kubernetes.io/projected/bc381d25-0a5e-438b-b2c8-45edc80148f4-kube-api-access-f8rxc\") pod \"bc381d25-0a5e-438b-b2c8-45edc80148f4\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.339789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-config-data\") pod \"bc381d25-0a5e-438b-b2c8-45edc80148f4\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.339867 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-scripts\") pod \"bc381d25-0a5e-438b-b2c8-45edc80148f4\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.340016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-combined-ca-bundle\") pod \"bc381d25-0a5e-438b-b2c8-45edc80148f4\" (UID: \"bc381d25-0a5e-438b-b2c8-45edc80148f4\") " Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.343876 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc381d25-0a5e-438b-b2c8-45edc80148f4-kube-api-access-f8rxc" (OuterVolumeSpecName: "kube-api-access-f8rxc") pod "bc381d25-0a5e-438b-b2c8-45edc80148f4" (UID: "bc381d25-0a5e-438b-b2c8-45edc80148f4"). InnerVolumeSpecName "kube-api-access-f8rxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.348372 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-scripts" (OuterVolumeSpecName: "scripts") pod "bc381d25-0a5e-438b-b2c8-45edc80148f4" (UID: "bc381d25-0a5e-438b-b2c8-45edc80148f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.381654 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-config-data" (OuterVolumeSpecName: "config-data") pod "bc381d25-0a5e-438b-b2c8-45edc80148f4" (UID: "bc381d25-0a5e-438b-b2c8-45edc80148f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.383279 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc381d25-0a5e-438b-b2c8-45edc80148f4" (UID: "bc381d25-0a5e-438b-b2c8-45edc80148f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.443494 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.443580 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8rxc\" (UniqueName: \"kubernetes.io/projected/bc381d25-0a5e-438b-b2c8-45edc80148f4-kube-api-access-f8rxc\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.443602 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.443620 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc381d25-0a5e-438b-b2c8-45edc80148f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.551038 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.717577 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r9shz" event={"ID":"bc381d25-0a5e-438b-b2c8-45edc80148f4","Type":"ContainerDied","Data":"bc14a8c90722fa0a3316d201087cfc0018f5daec3ab9cc69da9c8739ab18ea1f"} Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.717862 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc14a8c90722fa0a3316d201087cfc0018f5daec3ab9cc69da9c8739ab18ea1f" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.717928 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r9shz" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.721442 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a","Type":"ContainerStarted","Data":"4f73919796828e692b212384610081075c1398b8a54aba28020793a05a92e5a2"} Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.844925 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f670486-ce95-470b-abdc-42ca55378cc2" path="/var/lib/kubelet/pods/9f670486-ce95-470b-abdc-42ca55378cc2/volumes" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.845608 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 14 13:20:30 crc kubenswrapper[4837]: E1014 13:20:30.846196 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc381d25-0a5e-438b-b2c8-45edc80148f4" containerName="nova-cell1-conductor-db-sync" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.846264 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc381d25-0a5e-438b-b2c8-45edc80148f4" containerName="nova-cell1-conductor-db-sync" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.846950 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc381d25-0a5e-438b-b2c8-45edc80148f4" containerName="nova-cell1-conductor-db-sync" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.848181 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.852982 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.855320 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.861911 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2118184-3d60-4d7a-b203-961341c9be78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.862810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2118184-3d60-4d7a-b203-961341c9be78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.862881 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxdx\" (UniqueName: \"kubernetes.io/projected/c2118184-3d60-4d7a-b203-961341c9be78-kube-api-access-tlxdx\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.963683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2118184-3d60-4d7a-b203-961341c9be78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc 
kubenswrapper[4837]: I1014 13:20:30.963771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxdx\" (UniqueName: \"kubernetes.io/projected/c2118184-3d60-4d7a-b203-961341c9be78-kube-api-access-tlxdx\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.963838 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2118184-3d60-4d7a-b203-961341c9be78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.969265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2118184-3d60-4d7a-b203-961341c9be78-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.970838 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2118184-3d60-4d7a-b203-961341c9be78-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:30 crc kubenswrapper[4837]: I1014 13:20:30.979768 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxdx\" (UniqueName: \"kubernetes.io/projected/c2118184-3d60-4d7a-b203-961341c9be78-kube-api-access-tlxdx\") pod \"nova-cell1-conductor-0\" (UID: \"c2118184-3d60-4d7a-b203-961341c9be78\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:31 crc kubenswrapper[4837]: I1014 13:20:31.177771 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:31 crc kubenswrapper[4837]: I1014 13:20:31.623461 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 14 13:20:31 crc kubenswrapper[4837]: W1014 13:20:31.628035 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2118184_3d60_4d7a_b203_961341c9be78.slice/crio-ea5347afc2382d986366c5210426a895c10e19485725e5667a729dc5dd956510 WatchSource:0}: Error finding container ea5347afc2382d986366c5210426a895c10e19485725e5667a729dc5dd956510: Status 404 returned error can't find the container with id ea5347afc2382d986366c5210426a895c10e19485725e5667a729dc5dd956510 Oct 14 13:20:31 crc kubenswrapper[4837]: I1014 13:20:31.742032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c2118184-3d60-4d7a-b203-961341c9be78","Type":"ContainerStarted","Data":"ea5347afc2382d986366c5210426a895c10e19485725e5667a729dc5dd956510"} Oct 14 13:20:31 crc kubenswrapper[4837]: I1014 13:20:31.745802 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a","Type":"ContainerStarted","Data":"3005aebb5857896971649463f1bfd5e6d34b881d8dd10c31dd6936c0be8c2bde"} Oct 14 13:20:31 crc kubenswrapper[4837]: I1014 13:20:31.774393 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.774373291 podStartE2EDuration="2.774373291s" podCreationTimestamp="2025-10-14 13:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:31.772843159 +0000 UTC m=+1169.689842972" watchObservedRunningTime="2025-10-14 13:20:31.774373291 +0000 UTC m=+1169.691373114" Oct 14 13:20:32 crc kubenswrapper[4837]: I1014 13:20:32.756783 
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c2118184-3d60-4d7a-b203-961341c9be78","Type":"ContainerStarted","Data":"7c8677a519ab8879e649b3fc4d3d8d545390d3c61d6fa4984330af47fec0eacb"} Oct 14 13:20:32 crc kubenswrapper[4837]: I1014 13:20:32.757239 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:32 crc kubenswrapper[4837]: I1014 13:20:32.783156 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.783134721 podStartE2EDuration="2.783134721s" podCreationTimestamp="2025-10-14 13:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:32.772723542 +0000 UTC m=+1170.689723355" watchObservedRunningTime="2025-10-14 13:20:32.783134721 +0000 UTC m=+1170.700134544" Oct 14 13:20:34 crc kubenswrapper[4837]: I1014 13:20:34.988414 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:20:34 crc kubenswrapper[4837]: I1014 13:20:34.988960 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:20:35 crc kubenswrapper[4837]: I1014 13:20:35.113573 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 14 13:20:35 crc kubenswrapper[4837]: I1014 13:20:35.508879 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:20:36 crc kubenswrapper[4837]: I1014 13:20:36.006366 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Oct 14 13:20:36 crc kubenswrapper[4837]: I1014 13:20:36.006389 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:20:36 crc kubenswrapper[4837]: I1014 13:20:36.218563 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 14 13:20:38 crc kubenswrapper[4837]: I1014 13:20:38.395897 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:20:38 crc kubenswrapper[4837]: I1014 13:20:38.396268 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:20:39 crc kubenswrapper[4837]: I1014 13:20:39.161900 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:20:39 crc kubenswrapper[4837]: I1014 13:20:39.162108 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="561b6993-3913-4d1a-89d8-c146f5e2bd6a" containerName="kube-state-metrics" containerID="cri-o://172a03390e7002c502ec2b291ec110442330b698578d9984e74b70c1b89c66a8" gracePeriod=30 Oct 14 13:20:39 crc kubenswrapper[4837]: I1014 13:20:39.479333 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:20:39 crc kubenswrapper[4837]: I1014 13:20:39.479357 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:20:39 crc kubenswrapper[4837]: I1014 13:20:39.840903 4837 generic.go:334] "Generic (PLEG): container finished" podID="561b6993-3913-4d1a-89d8-c146f5e2bd6a" containerID="172a03390e7002c502ec2b291ec110442330b698578d9984e74b70c1b89c66a8" exitCode=2 Oct 14 13:20:39 crc kubenswrapper[4837]: I1014 13:20:39.841693 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"561b6993-3913-4d1a-89d8-c146f5e2bd6a","Type":"ContainerDied","Data":"172a03390e7002c502ec2b291ec110442330b698578d9984e74b70c1b89c66a8"} Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.113562 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.145538 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.153738 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.250908 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jt9z\" (UniqueName: \"kubernetes.io/projected/561b6993-3913-4d1a-89d8-c146f5e2bd6a-kube-api-access-9jt9z\") pod \"561b6993-3913-4d1a-89d8-c146f5e2bd6a\" (UID: \"561b6993-3913-4d1a-89d8-c146f5e2bd6a\") " Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.260494 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561b6993-3913-4d1a-89d8-c146f5e2bd6a-kube-api-access-9jt9z" (OuterVolumeSpecName: "kube-api-access-9jt9z") pod "561b6993-3913-4d1a-89d8-c146f5e2bd6a" (UID: "561b6993-3913-4d1a-89d8-c146f5e2bd6a"). 
InnerVolumeSpecName "kube-api-access-9jt9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.353151 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jt9z\" (UniqueName: \"kubernetes.io/projected/561b6993-3913-4d1a-89d8-c146f5e2bd6a-kube-api-access-9jt9z\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.851637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"561b6993-3913-4d1a-89d8-c146f5e2bd6a","Type":"ContainerDied","Data":"09ef5f60d18414704094a910cf86c075c41a8321829f4583de40095b9650c39f"} Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.851689 4837 scope.go:117] "RemoveContainer" containerID="172a03390e7002c502ec2b291ec110442330b698578d9984e74b70c1b89c66a8" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.851907 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.892549 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.914922 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.928217 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.938433 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:20:40 crc kubenswrapper[4837]: E1014 13:20:40.938798 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561b6993-3913-4d1a-89d8-c146f5e2bd6a" containerName="kube-state-metrics" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.938816 4837 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="561b6993-3913-4d1a-89d8-c146f5e2bd6a" containerName="kube-state-metrics" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.939007 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="561b6993-3913-4d1a-89d8-c146f5e2bd6a" containerName="kube-state-metrics" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.939612 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.943059 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.943189 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 14 13:20:40 crc kubenswrapper[4837]: I1014 13:20:40.946090 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.066402 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.066570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.066628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs4lv\" (UniqueName: 
\"kubernetes.io/projected/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-api-access-vs4lv\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.066789 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.113455 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.113882 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-central-agent" containerID="cri-o://b2a459432b061fe738ecf8e213e4c4f7b47c965d4e77f61bf36ceed898c029ab" gracePeriod=30 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.113947 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="proxy-httpd" containerID="cri-o://f6893e44a60a0237de489adef9ca8e9d742555369287b358ef14d7b245eba328" gracePeriod=30 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.114036 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-notification-agent" containerID="cri-o://af9c55a9b7255ca55cd99331f2151ad7f9390a493ad77c04f04643b6c5d20198" gracePeriod=30 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.114264 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="sg-core" containerID="cri-o://50ddee8ada23b5c7cf4616d1026eaf7f820614399d32a171018078b4a94d9a4c" gracePeriod=30 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.168770 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4lv\" (UniqueName: \"kubernetes.io/projected/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-api-access-vs4lv\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.168875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.168995 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.169035 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.174685 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-combined-ca-bundle\") pod 
\"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.178244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.178508 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.202245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs4lv\" (UniqueName: \"kubernetes.io/projected/b20b7196-1920-4b12-a38e-7356ca4dc4e2-kube-api-access-vs4lv\") pod \"kube-state-metrics-0\" (UID: \"b20b7196-1920-4b12-a38e-7356ca4dc4e2\") " pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.267199 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.824944 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.842789 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.869914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b20b7196-1920-4b12-a38e-7356ca4dc4e2","Type":"ContainerStarted","Data":"dd5d412c545eb7de4e5de0eba472542182e3c482dc17e08fbea5998aee3bd4ab"} Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.873389 4837 generic.go:334] "Generic (PLEG): container finished" podID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerID="f6893e44a60a0237de489adef9ca8e9d742555369287b358ef14d7b245eba328" exitCode=0 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.873424 4837 generic.go:334] "Generic (PLEG): container finished" podID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerID="50ddee8ada23b5c7cf4616d1026eaf7f820614399d32a171018078b4a94d9a4c" exitCode=2 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.873437 4837 generic.go:334] "Generic (PLEG): container finished" podID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerID="b2a459432b061fe738ecf8e213e4c4f7b47c965d4e77f61bf36ceed898c029ab" exitCode=0 Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.873498 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerDied","Data":"f6893e44a60a0237de489adef9ca8e9d742555369287b358ef14d7b245eba328"} Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.873528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerDied","Data":"50ddee8ada23b5c7cf4616d1026eaf7f820614399d32a171018078b4a94d9a4c"} Oct 14 13:20:41 crc kubenswrapper[4837]: I1014 13:20:41.873543 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerDied","Data":"b2a459432b061fe738ecf8e213e4c4f7b47c965d4e77f61bf36ceed898c029ab"} Oct 14 13:20:42 crc kubenswrapper[4837]: I1014 13:20:42.800415 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561b6993-3913-4d1a-89d8-c146f5e2bd6a" path="/var/lib/kubelet/pods/561b6993-3913-4d1a-89d8-c146f5e2bd6a/volumes" Oct 14 13:20:42 crc kubenswrapper[4837]: I1014 13:20:42.886484 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b20b7196-1920-4b12-a38e-7356ca4dc4e2","Type":"ContainerStarted","Data":"e156fd014a540a51f6e03804e120dc44f4545840f243020875906c717caa8f1f"} Oct 14 13:20:42 crc kubenswrapper[4837]: I1014 13:20:42.887576 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 14 13:20:44 crc kubenswrapper[4837]: I1014 13:20:44.995668 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:20:45 crc kubenswrapper[4837]: I1014 13:20:45.004954 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:20:45 crc kubenswrapper[4837]: I1014 13:20:45.010091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:20:45 crc kubenswrapper[4837]: I1014 13:20:45.025355 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=4.672328286 podStartE2EDuration="5.025332221s" podCreationTimestamp="2025-10-14 13:20:40 +0000 UTC" 
firstStartedPulling="2025-10-14 13:20:41.84258251 +0000 UTC m=+1179.759582323" lastFinishedPulling="2025-10-14 13:20:42.195586435 +0000 UTC m=+1180.112586258" observedRunningTime="2025-10-14 13:20:42.910252248 +0000 UTC m=+1180.827252071" watchObservedRunningTime="2025-10-14 13:20:45.025332221 +0000 UTC m=+1182.942332044" Oct 14 13:20:45 crc kubenswrapper[4837]: I1014 13:20:45.921194 4837 generic.go:334] "Generic (PLEG): container finished" podID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerID="af9c55a9b7255ca55cd99331f2151ad7f9390a493ad77c04f04643b6c5d20198" exitCode=0 Oct 14 13:20:45 crc kubenswrapper[4837]: I1014 13:20:45.921296 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerDied","Data":"af9c55a9b7255ca55cd99331f2151ad7f9390a493ad77c04f04643b6c5d20198"} Oct 14 13:20:45 crc kubenswrapper[4837]: I1014 13:20:45.932530 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.132658 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.275956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxmr7\" (UniqueName: \"kubernetes.io/projected/cd31b711-5af6-4b4e-89eb-65085f9b88e6-kube-api-access-nxmr7\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276050 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-sg-core-conf-yaml\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-config-data\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276233 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-run-httpd\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-scripts\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276304 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-combined-ca-bundle\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276362 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-log-httpd\") pod \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\" (UID: \"cd31b711-5af6-4b4e-89eb-65085f9b88e6\") " Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.276978 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.277333 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.277335 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.283653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd31b711-5af6-4b4e-89eb-65085f9b88e6-kube-api-access-nxmr7" (OuterVolumeSpecName: "kube-api-access-nxmr7") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). 
InnerVolumeSpecName "kube-api-access-nxmr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.302371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-scripts" (OuterVolumeSpecName: "scripts") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.335290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.383264 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd31b711-5af6-4b4e-89eb-65085f9b88e6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.383307 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.383318 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxmr7\" (UniqueName: \"kubernetes.io/projected/cd31b711-5af6-4b4e-89eb-65085f9b88e6-kube-api-access-nxmr7\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.383328 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.431386 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.457558 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-config-data" (OuterVolumeSpecName: "config-data") pod "cd31b711-5af6-4b4e-89eb-65085f9b88e6" (UID: "cd31b711-5af6-4b4e-89eb-65085f9b88e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.484386 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.484414 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd31b711-5af6-4b4e-89eb-65085f9b88e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.931213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd31b711-5af6-4b4e-89eb-65085f9b88e6","Type":"ContainerDied","Data":"5248fcbd5608431efa6376d7c89682a12e27eb28c50f3d532a1bd866fb827613"} Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.931241 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.931288 4837 scope.go:117] "RemoveContainer" containerID="f6893e44a60a0237de489adef9ca8e9d742555369287b358ef14d7b245eba328" Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.958758 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.977959 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:46 crc kubenswrapper[4837]: I1014 13:20:46.981674 4837 scope.go:117] "RemoveContainer" containerID="50ddee8ada23b5c7cf4616d1026eaf7f820614399d32a171018078b4a94d9a4c" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.004711 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:47 crc kubenswrapper[4837]: E1014 13:20:47.008227 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="sg-core" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.008271 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="sg-core" Oct 14 13:20:47 crc kubenswrapper[4837]: E1014 13:20:47.008302 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-central-agent" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.008315 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-central-agent" Oct 14 13:20:47 crc kubenswrapper[4837]: E1014 13:20:47.008385 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-notification-agent" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.008397 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-notification-agent" Oct 14 13:20:47 crc kubenswrapper[4837]: E1014 13:20:47.008467 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="proxy-httpd" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.008481 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="proxy-httpd" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.009312 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-notification-agent" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.009357 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="ceilometer-central-agent" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.009415 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="proxy-httpd" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.009459 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" containerName="sg-core" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.016715 4837 scope.go:117] "RemoveContainer" containerID="af9c55a9b7255ca55cd99331f2151ad7f9390a493ad77c04f04643b6c5d20198" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.027631 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.027760 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.034700 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.034931 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.035217 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.055496 4837 scope.go:117] "RemoveContainer" containerID="b2a459432b061fe738ecf8e213e4c4f7b47c965d4e77f61bf36ceed898c029ab" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-config-data\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094187 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094212 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-run-httpd\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-log-httpd\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094263 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-scripts\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094379 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkh4\" (UniqueName: \"kubernetes.io/projected/c0583441-b4b9-4511-ab05-9af993b41584-kube-api-access-sqkh4\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.094398 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195719 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-config-data\") pod \"ceilometer-0\" 
(UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195812 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-run-httpd\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-log-httpd\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-scripts\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195892 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sqkh4\" (UniqueName: \"kubernetes.io/projected/c0583441-b4b9-4511-ab05-9af993b41584-kube-api-access-sqkh4\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.195981 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.197328 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-log-httpd\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.200667 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.201121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-config-data\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.203791 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc 
kubenswrapper[4837]: I1014 13:20:47.204117 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-run-httpd\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.207089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-scripts\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.209412 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.225664 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkh4\" (UniqueName: \"kubernetes.io/projected/c0583441-b4b9-4511-ab05-9af993b41584-kube-api-access-sqkh4\") pod \"ceilometer-0\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.353043 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.810853 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:47 crc kubenswrapper[4837]: W1014 13:20:47.822220 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0583441_b4b9_4511_ab05_9af993b41584.slice/crio-50958b1b1b28be586c161a669959d9bb5d29660e21b65c24eb70b1fc151fd832 WatchSource:0}: Error finding container 50958b1b1b28be586c161a669959d9bb5d29660e21b65c24eb70b1fc151fd832: Status 404 returned error can't find the container with id 50958b1b1b28be586c161a669959d9bb5d29660e21b65c24eb70b1fc151fd832 Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.914701 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.945288 4837 generic.go:334] "Generic (PLEG): container finished" podID="0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" containerID="e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f" exitCode=137 Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.945947 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3","Type":"ContainerDied","Data":"e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f"} Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.948405 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3","Type":"ContainerDied","Data":"87b7265f8e0a541f80847fe4eb0a638e33b791bf7d9e2d818c24ca87916429ba"} Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.948446 4837 scope.go:117] "RemoveContainer" containerID="e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f" Oct 14 13:20:47 crc 
kubenswrapper[4837]: I1014 13:20:47.948700 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.952718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerStarted","Data":"50958b1b1b28be586c161a669959d9bb5d29660e21b65c24eb70b1fc151fd832"} Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.975811 4837 scope.go:117] "RemoveContainer" containerID="e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f" Oct 14 13:20:47 crc kubenswrapper[4837]: E1014 13:20:47.976278 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f\": container with ID starting with e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f not found: ID does not exist" containerID="e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f" Oct 14 13:20:47 crc kubenswrapper[4837]: I1014 13:20:47.976321 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f"} err="failed to get container status \"e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f\": rpc error: code = NotFound desc = could not find container \"e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f\": container with ID starting with e7a640a51de34b6aa8037a41dd752121095352beff79a2143007c18fb548c52f not found: ID does not exist" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.009706 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5mzq\" (UniqueName: \"kubernetes.io/projected/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-kube-api-access-n5mzq\") pod 
\"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.009758 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-combined-ca-bundle\") pod \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.009889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-config-data\") pod \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\" (UID: \"0cb1b7f4-168e-48ec-a86f-58b2d40bdde3\") " Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.027051 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-kube-api-access-n5mzq" (OuterVolumeSpecName: "kube-api-access-n5mzq") pod "0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" (UID: "0cb1b7f4-168e-48ec-a86f-58b2d40bdde3"). InnerVolumeSpecName "kube-api-access-n5mzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.036571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" (UID: "0cb1b7f4-168e-48ec-a86f-58b2d40bdde3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.053598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-config-data" (OuterVolumeSpecName: "config-data") pod "0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" (UID: "0cb1b7f4-168e-48ec-a86f-58b2d40bdde3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.113897 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.113935 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5mzq\" (UniqueName: \"kubernetes.io/projected/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-kube-api-access-n5mzq\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.113948 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.291323 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.303833 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.314734 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:48 crc kubenswrapper[4837]: E1014 13:20:48.315232 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" containerName="nova-cell1-novncproxy-novncproxy" Oct 14 
13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.315257 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" containerName="nova-cell1-novncproxy-novncproxy" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.315546 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" containerName="nova-cell1-novncproxy-novncproxy" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.316326 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.318364 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.319449 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.319756 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.323403 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.399950 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.400386 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.401089 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.403629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 
13:20:48.418028 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.418103 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jrz\" (UniqueName: \"kubernetes.io/projected/abb35cf2-796d-40bc-8b6b-d421dec44645-kube-api-access-m6jrz\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.418177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.418240 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.418270 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc 
kubenswrapper[4837]: I1014 13:20:48.520550 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.520991 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.521099 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.521224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jrz\" (UniqueName: \"kubernetes.io/projected/abb35cf2-796d-40bc-8b6b-d421dec44645-kube-api-access-m6jrz\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.521379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.525335 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.525773 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.526354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.526499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb35cf2-796d-40bc-8b6b-d421dec44645-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.544089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jrz\" (UniqueName: \"kubernetes.io/projected/abb35cf2-796d-40bc-8b6b-d421dec44645-kube-api-access-m6jrz\") pod \"nova-cell1-novncproxy-0\" (UID: \"abb35cf2-796d-40bc-8b6b-d421dec44645\") " pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.637279 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.806373 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb1b7f4-168e-48ec-a86f-58b2d40bdde3" path="/var/lib/kubelet/pods/0cb1b7f4-168e-48ec-a86f-58b2d40bdde3/volumes" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.807397 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd31b711-5af6-4b4e-89eb-65085f9b88e6" path="/var/lib/kubelet/pods/cd31b711-5af6-4b4e-89eb-65085f9b88e6/volumes" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.966063 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerStarted","Data":"6e63a933b4cda16f71345efd0054206b6c8b6b5e6fdf8067785c4fe5d11068b8"} Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.966416 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:20:48 crc kubenswrapper[4837]: I1014 13:20:48.970621 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.121331 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 14 13:20:49 crc kubenswrapper[4837]: W1014 13:20:49.128755 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabb35cf2_796d_40bc_8b6b_d421dec44645.slice/crio-3b40db20b1592fdb9916dbcf50916b20faea7439e62d8b476b213c866053afe5 WatchSource:0}: Error finding container 3b40db20b1592fdb9916dbcf50916b20faea7439e62d8b476b213c866053afe5: Status 404 returned error can't find the container with id 3b40db20b1592fdb9916dbcf50916b20faea7439e62d8b476b213c866053afe5 Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.181286 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5c7b6c5df9-6hcqh"] Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.182876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.223778 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6hcqh"] Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.242297 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-config\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.242532 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.242592 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.242611 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" 
Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.242654 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59pff\" (UniqueName: \"kubernetes.io/projected/e35cc05c-bae2-4693-88ff-2e28dce40010-kube-api-access-59pff\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.242696 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.344170 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59pff\" (UniqueName: \"kubernetes.io/projected/e35cc05c-bae2-4693-88ff-2e28dce40010-kube-api-access-59pff\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.344236 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.344295 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-config\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " 
pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.344326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.344386 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.344407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.345400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.345473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-config\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 
13:20:49.345493 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.345907 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.346096 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.369885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59pff\" (UniqueName: \"kubernetes.io/projected/e35cc05c-bae2-4693-88ff-2e28dce40010-kube-api-access-59pff\") pod \"dnsmasq-dns-5c7b6c5df9-6hcqh\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:49 crc kubenswrapper[4837]: I1014 13:20:49.554480 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:50 crc kubenswrapper[4837]: I1014 13:20:50.002648 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"abb35cf2-796d-40bc-8b6b-d421dec44645","Type":"ContainerStarted","Data":"7e64c05155f220d5a158ac72c86dc26a2c5b8db237c274cfa1cbfaa8270944ee"} Oct 14 13:20:50 crc kubenswrapper[4837]: I1014 13:20:50.003012 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"abb35cf2-796d-40bc-8b6b-d421dec44645","Type":"ContainerStarted","Data":"3b40db20b1592fdb9916dbcf50916b20faea7439e62d8b476b213c866053afe5"} Oct 14 13:20:50 crc kubenswrapper[4837]: I1014 13:20:50.013261 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerStarted","Data":"f3031faf1f64a0900e8660aff3f87f39742aeac87615e2fb1e2d76a61889354e"} Oct 14 13:20:50 crc kubenswrapper[4837]: I1014 13:20:50.045257 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.045234811 podStartE2EDuration="2.045234811s" podCreationTimestamp="2025-10-14 13:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:50.039541898 +0000 UTC m=+1187.956541711" watchObservedRunningTime="2025-10-14 13:20:50.045234811 +0000 UTC m=+1187.962234624" Oct 14 13:20:50 crc kubenswrapper[4837]: I1014 13:20:50.158356 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6hcqh"] Oct 14 13:20:51 crc kubenswrapper[4837]: I1014 13:20:51.024488 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerID="2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3" exitCode=0 Oct 14 13:20:51 crc 
kubenswrapper[4837]: I1014 13:20:51.024550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" event={"ID":"e35cc05c-bae2-4693-88ff-2e28dce40010","Type":"ContainerDied","Data":"2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3"} Oct 14 13:20:51 crc kubenswrapper[4837]: I1014 13:20:51.025113 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" event={"ID":"e35cc05c-bae2-4693-88ff-2e28dce40010","Type":"ContainerStarted","Data":"5acbeef248dbee38ff327852d6bbabf2c549f3994f3e52abbc736f8c8a3ed34f"} Oct 14 13:20:51 crc kubenswrapper[4837]: I1014 13:20:51.029366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerStarted","Data":"de0daae2b79dda7cab58fb01700804988a0bcf60733c159a55ae50bb5786f4d9"} Oct 14 13:20:51 crc kubenswrapper[4837]: I1014 13:20:51.294335 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 14 13:20:51 crc kubenswrapper[4837]: I1014 13:20:51.963110 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:52 crc kubenswrapper[4837]: I1014 13:20:52.040638 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" event={"ID":"e35cc05c-bae2-4693-88ff-2e28dce40010","Type":"ContainerStarted","Data":"9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25"} Oct 14 13:20:52 crc kubenswrapper[4837]: I1014 13:20:52.040793 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-log" containerID="cri-o://95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1" gracePeriod=30 Oct 14 13:20:52 crc kubenswrapper[4837]: I1014 13:20:52.040851 4837 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-api-0" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-api" containerID="cri-o://474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e" gracePeriod=30 Oct 14 13:20:52 crc kubenswrapper[4837]: I1014 13:20:52.064357 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" podStartSLOduration=3.064341492 podStartE2EDuration="3.064341492s" podCreationTimestamp="2025-10-14 13:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:52.060971952 +0000 UTC m=+1189.977971785" watchObservedRunningTime="2025-10-14 13:20:52.064341492 +0000 UTC m=+1189.981341305" Oct 14 13:20:52 crc kubenswrapper[4837]: I1014 13:20:52.760738 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:53 crc kubenswrapper[4837]: I1014 13:20:53.104546 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerStarted","Data":"5324380b2c37d29aa786799461fc5a5014698bda9d1dd0f43b74cae1e793d6fe"} Oct 14 13:20:53 crc kubenswrapper[4837]: I1014 13:20:53.108018 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:20:53 crc kubenswrapper[4837]: I1014 13:20:53.151579 4837 generic.go:334] "Generic (PLEG): container finished" podID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerID="95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1" exitCode=143 Oct 14 13:20:53 crc kubenswrapper[4837]: I1014 13:20:53.152230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ad51b65-0353-4bd6-8bb9-b5cc7943e934","Type":"ContainerDied","Data":"95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1"} Oct 14 13:20:53 crc kubenswrapper[4837]: 
I1014 13:20:53.152845 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:53 crc kubenswrapper[4837]: I1014 13:20:53.153812 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.91513571 podStartE2EDuration="7.153795674s" podCreationTimestamp="2025-10-14 13:20:46 +0000 UTC" firstStartedPulling="2025-10-14 13:20:47.82535933 +0000 UTC m=+1185.742359143" lastFinishedPulling="2025-10-14 13:20:52.064019294 +0000 UTC m=+1189.981019107" observedRunningTime="2025-10-14 13:20:53.152568711 +0000 UTC m=+1191.069568524" watchObservedRunningTime="2025-10-14 13:20:53.153795674 +0000 UTC m=+1191.070795487" Oct 14 13:20:53 crc kubenswrapper[4837]: I1014 13:20:53.637496 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:54 crc kubenswrapper[4837]: I1014 13:20:54.160801 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-central-agent" containerID="cri-o://6e63a933b4cda16f71345efd0054206b6c8b6b5e6fdf8067785c4fe5d11068b8" gracePeriod=30 Oct 14 13:20:54 crc kubenswrapper[4837]: I1014 13:20:54.161131 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="proxy-httpd" containerID="cri-o://5324380b2c37d29aa786799461fc5a5014698bda9d1dd0f43b74cae1e793d6fe" gracePeriod=30 Oct 14 13:20:54 crc kubenswrapper[4837]: I1014 13:20:54.161085 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="sg-core" containerID="cri-o://de0daae2b79dda7cab58fb01700804988a0bcf60733c159a55ae50bb5786f4d9" gracePeriod=30 Oct 14 13:20:54 crc kubenswrapper[4837]: 
I1014 13:20:54.161117 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-notification-agent" containerID="cri-o://f3031faf1f64a0900e8660aff3f87f39742aeac87615e2fb1e2d76a61889354e" gracePeriod=30 Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.172819 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0583441-b4b9-4511-ab05-9af993b41584" containerID="5324380b2c37d29aa786799461fc5a5014698bda9d1dd0f43b74cae1e793d6fe" exitCode=0 Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.173154 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0583441-b4b9-4511-ab05-9af993b41584" containerID="de0daae2b79dda7cab58fb01700804988a0bcf60733c159a55ae50bb5786f4d9" exitCode=2 Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.173189 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0583441-b4b9-4511-ab05-9af993b41584" containerID="f3031faf1f64a0900e8660aff3f87f39742aeac87615e2fb1e2d76a61889354e" exitCode=0 Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.172886 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerDied","Data":"5324380b2c37d29aa786799461fc5a5014698bda9d1dd0f43b74cae1e793d6fe"} Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.173233 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerDied","Data":"de0daae2b79dda7cab58fb01700804988a0bcf60733c159a55ae50bb5786f4d9"} Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.173251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerDied","Data":"f3031faf1f64a0900e8660aff3f87f39742aeac87615e2fb1e2d76a61889354e"} Oct 14 13:20:55 crc 
kubenswrapper[4837]: I1014 13:20:55.706552 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.774219 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-combined-ca-bundle\") pod \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.774424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-logs\") pod \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.774520 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-config-data\") pod \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.774607 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnl69\" (UniqueName: \"kubernetes.io/projected/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-kube-api-access-tnl69\") pod \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\" (UID: \"0ad51b65-0353-4bd6-8bb9-b5cc7943e934\") " Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.776333 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-logs" (OuterVolumeSpecName: "logs") pod "0ad51b65-0353-4bd6-8bb9-b5cc7943e934" (UID: "0ad51b65-0353-4bd6-8bb9-b5cc7943e934"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.781875 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-kube-api-access-tnl69" (OuterVolumeSpecName: "kube-api-access-tnl69") pod "0ad51b65-0353-4bd6-8bb9-b5cc7943e934" (UID: "0ad51b65-0353-4bd6-8bb9-b5cc7943e934"). InnerVolumeSpecName "kube-api-access-tnl69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.805848 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ad51b65-0353-4bd6-8bb9-b5cc7943e934" (UID: "0ad51b65-0353-4bd6-8bb9-b5cc7943e934"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.810175 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-config-data" (OuterVolumeSpecName: "config-data") pod "0ad51b65-0353-4bd6-8bb9-b5cc7943e934" (UID: "0ad51b65-0353-4bd6-8bb9-b5cc7943e934"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.876577 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.876624 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.876639 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnl69\" (UniqueName: \"kubernetes.io/projected/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-kube-api-access-tnl69\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:55 crc kubenswrapper[4837]: I1014 13:20:55.876653 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad51b65-0353-4bd6-8bb9-b5cc7943e934-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.189477 4837 generic.go:334] "Generic (PLEG): container finished" podID="c0583441-b4b9-4511-ab05-9af993b41584" containerID="6e63a933b4cda16f71345efd0054206b6c8b6b5e6fdf8067785c4fe5d11068b8" exitCode=0 Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.189573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerDied","Data":"6e63a933b4cda16f71345efd0054206b6c8b6b5e6fdf8067785c4fe5d11068b8"} Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.193310 4837 generic.go:334] "Generic (PLEG): container finished" podID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerID="474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e" exitCode=0 Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.193367 4837 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.193362 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ad51b65-0353-4bd6-8bb9-b5cc7943e934","Type":"ContainerDied","Data":"474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e"} Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.195536 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ad51b65-0353-4bd6-8bb9-b5cc7943e934","Type":"ContainerDied","Data":"e9b41635a5e7dc6379f74f890cb7d6299f4ac25bf81c303ba43a52316bd09534"} Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.195563 4837 scope.go:117] "RemoveContainer" containerID="474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.237555 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.244116 4837 scope.go:117] "RemoveContainer" containerID="95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.256267 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.267622 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:56 crc kubenswrapper[4837]: E1014 13:20:56.268234 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-api" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.268264 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-api" Oct 14 13:20:56 crc kubenswrapper[4837]: E1014 13:20:56.268333 4837 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-log" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.268346 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-log" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.268642 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-log" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.268682 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" containerName="nova-api-api" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.270415 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.274209 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.275061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.277384 4837 scope.go:117] "RemoveContainer" containerID="474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.279098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 14 13:20:56 crc kubenswrapper[4837]: E1014 13:20:56.287545 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e\": container with ID starting with 474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e not found: ID does not exist" containerID="474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e" Oct 14 13:20:56 
crc kubenswrapper[4837]: I1014 13:20:56.287592 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e"} err="failed to get container status \"474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e\": rpc error: code = NotFound desc = could not find container \"474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e\": container with ID starting with 474f3ebab6f0ffb6529883135dfb0e029754e4a61e08a963e09e6350330cf44e not found: ID does not exist" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.287621 4837 scope.go:117] "RemoveContainer" containerID="95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1" Oct 14 13:20:56 crc kubenswrapper[4837]: E1014 13:20:56.289181 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1\": container with ID starting with 95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1 not found: ID does not exist" containerID="95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.289222 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1"} err="failed to get container status \"95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1\": rpc error: code = NotFound desc = could not find container \"95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1\": container with ID starting with 95025b871d027668b3176c9675b3d80971e3413e18ce43c5b7f1ec27fed23af1 not found: ID does not exist" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.299745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:56 crc 
kubenswrapper[4837]: I1014 13:20:56.307094 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.384959 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-config-data\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.385252 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-sg-core-conf-yaml\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.385361 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-combined-ca-bundle\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.385529 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-scripts\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-run-httpd\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386220 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-log-httpd\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386320 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-ceilometer-tls-certs\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqkh4\" (UniqueName: \"kubernetes.io/projected/c0583441-b4b9-4511-ab05-9af993b41584-kube-api-access-sqkh4\") pod \"c0583441-b4b9-4511-ab05-9af993b41584\" (UID: \"c0583441-b4b9-4511-ab05-9af993b41584\") " Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386632 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.386947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-config-data\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387199 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-public-tls-certs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93541fa5-f5c5-459f-9419-7ff360db2ae6-logs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bd8b\" (UniqueName: \"kubernetes.io/projected/93541fa5-f5c5-459f-9419-7ff360db2ae6-kube-api-access-2bd8b\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387489 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387666 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.387736 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0583441-b4b9-4511-ab05-9af993b41584-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.390082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0583441-b4b9-4511-ab05-9af993b41584-kube-api-access-sqkh4" (OuterVolumeSpecName: "kube-api-access-sqkh4") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "kube-api-access-sqkh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.402873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-scripts" (OuterVolumeSpecName: "scripts") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.425181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.440349 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.459854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489452 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93541fa5-f5c5-459f-9419-7ff360db2ae6-logs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489523 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bd8b\" (UniqueName: \"kubernetes.io/projected/93541fa5-f5c5-459f-9419-7ff360db2ae6-kube-api-access-2bd8b\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489641 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-config-data\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-public-tls-certs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489840 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489858 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489871 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqkh4\" (UniqueName: \"kubernetes.io/projected/c0583441-b4b9-4511-ab05-9af993b41584-kube-api-access-sqkh4\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489883 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489895 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.489906 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93541fa5-f5c5-459f-9419-7ff360db2ae6-logs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.493201 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.495457 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-public-tls-certs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.502099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.502875 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-config-data\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.504307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-config-data" (OuterVolumeSpecName: "config-data") pod "c0583441-b4b9-4511-ab05-9af993b41584" (UID: "c0583441-b4b9-4511-ab05-9af993b41584"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.513320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bd8b\" (UniqueName: \"kubernetes.io/projected/93541fa5-f5c5-459f-9419-7ff360db2ae6-kube-api-access-2bd8b\") pod \"nova-api-0\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.591376 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0583441-b4b9-4511-ab05-9af993b41584-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.597975 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:20:56 crc kubenswrapper[4837]: I1014 13:20:56.814605 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad51b65-0353-4bd6-8bb9-b5cc7943e934" path="/var/lib/kubelet/pods/0ad51b65-0353-4bd6-8bb9-b5cc7943e934/volumes" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.029583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:20:57 crc kubenswrapper[4837]: W1014 13:20:57.033934 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93541fa5_f5c5_459f_9419_7ff360db2ae6.slice/crio-2a3b32c365eae006bc31fd65558eff85658cd2731cfd725b115cd755d155e671 WatchSource:0}: Error finding container 2a3b32c365eae006bc31fd65558eff85658cd2731cfd725b115cd755d155e671: Status 404 returned error can't find the container with id 2a3b32c365eae006bc31fd65558eff85658cd2731cfd725b115cd755d155e671 Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.205962 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"93541fa5-f5c5-459f-9419-7ff360db2ae6","Type":"ContainerStarted","Data":"2a3b32c365eae006bc31fd65558eff85658cd2731cfd725b115cd755d155e671"} Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.216918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0583441-b4b9-4511-ab05-9af993b41584","Type":"ContainerDied","Data":"50958b1b1b28be586c161a669959d9bb5d29660e21b65c24eb70b1fc151fd832"} Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.216959 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.217002 4837 scope.go:117] "RemoveContainer" containerID="5324380b2c37d29aa786799461fc5a5014698bda9d1dd0f43b74cae1e793d6fe" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.246512 4837 scope.go:117] "RemoveContainer" containerID="de0daae2b79dda7cab58fb01700804988a0bcf60733c159a55ae50bb5786f4d9" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.246623 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.256649 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.279907 4837 scope.go:117] "RemoveContainer" containerID="f3031faf1f64a0900e8660aff3f87f39742aeac87615e2fb1e2d76a61889354e" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283063 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:57 crc kubenswrapper[4837]: E1014 13:20:57.283559 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-central-agent" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283585 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0583441-b4b9-4511-ab05-9af993b41584" 
containerName="ceilometer-central-agent" Oct 14 13:20:57 crc kubenswrapper[4837]: E1014 13:20:57.283602 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-notification-agent" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283611 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-notification-agent" Oct 14 13:20:57 crc kubenswrapper[4837]: E1014 13:20:57.283633 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="proxy-httpd" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283642 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="proxy-httpd" Oct 14 13:20:57 crc kubenswrapper[4837]: E1014 13:20:57.283653 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="sg-core" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283661 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="sg-core" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283860 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="sg-core" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283877 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-notification-agent" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283897 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="ceilometer-central-agent" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.283904 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0583441-b4b9-4511-ab05-9af993b41584" containerName="proxy-httpd" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.285639 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.289883 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.290100 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.290241 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.299099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.438574 4837 scope.go:117] "RemoveContainer" containerID="6e63a933b4cda16f71345efd0054206b6c8b6b5e6fdf8067785c4fe5d11068b8" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79be4f00-8769-4d3c-aa9c-a1bb24787668-run-httpd\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-config-data\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-scripts\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478812 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478836 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8f8g\" (UniqueName: \"kubernetes.io/projected/79be4f00-8769-4d3c-aa9c-a1bb24787668-kube-api-access-t8f8g\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478923 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.478962 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79be4f00-8769-4d3c-aa9c-a1bb24787668-log-httpd\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.479019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79be4f00-8769-4d3c-aa9c-a1bb24787668-run-httpd\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581320 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-config-data\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581369 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-scripts\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581416 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8f8g\" (UniqueName: \"kubernetes.io/projected/79be4f00-8769-4d3c-aa9c-a1bb24787668-kube-api-access-t8f8g\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581594 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79be4f00-8769-4d3c-aa9c-a1bb24787668-log-httpd\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581661 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.581800 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79be4f00-8769-4d3c-aa9c-a1bb24787668-run-httpd\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.584557 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.585018 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79be4f00-8769-4d3c-aa9c-a1bb24787668-log-httpd\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc 
kubenswrapper[4837]: I1014 13:20:57.587098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.587421 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.587781 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-config-data\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.589675 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79be4f00-8769-4d3c-aa9c-a1bb24787668-scripts\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.605339 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8f8g\" (UniqueName: \"kubernetes.io/projected/79be4f00-8769-4d3c-aa9c-a1bb24787668-kube-api-access-t8f8g\") pod \"ceilometer-0\" (UID: \"79be4f00-8769-4d3c-aa9c-a1bb24787668\") " pod="openstack/ceilometer-0" Oct 14 13:20:57 crc kubenswrapper[4837]: I1014 13:20:57.739315 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.189218 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:20:58 crc kubenswrapper[4837]: W1014 13:20:58.191268 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79be4f00_8769_4d3c_aa9c_a1bb24787668.slice/crio-f8dc40da44901825bafa5f33b0fea8667a2e92656f3f1c47bd53d22423fc08b1 WatchSource:0}: Error finding container f8dc40da44901825bafa5f33b0fea8667a2e92656f3f1c47bd53d22423fc08b1: Status 404 returned error can't find the container with id f8dc40da44901825bafa5f33b0fea8667a2e92656f3f1c47bd53d22423fc08b1 Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.232328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93541fa5-f5c5-459f-9419-7ff360db2ae6","Type":"ContainerStarted","Data":"cedd08263862b957c94afc78594a626fef41b1663872e6c90cd89a98fb324bde"} Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.232374 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93541fa5-f5c5-459f-9419-7ff360db2ae6","Type":"ContainerStarted","Data":"30daba50566b7493b19d2cbd96c67111db75a0982b4c280911425493678cf7ac"} Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.235128 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79be4f00-8769-4d3c-aa9c-a1bb24787668","Type":"ContainerStarted","Data":"f8dc40da44901825bafa5f33b0fea8667a2e92656f3f1c47bd53d22423fc08b1"} Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.259187 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.2591478130000002 podStartE2EDuration="2.259147813s" podCreationTimestamp="2025-10-14 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:20:58.249615927 +0000 UTC m=+1196.166615740" watchObservedRunningTime="2025-10-14 13:20:58.259147813 +0000 UTC m=+1196.176147626" Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.638347 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.658685 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:58 crc kubenswrapper[4837]: I1014 13:20:58.806543 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0583441-b4b9-4511-ab05-9af993b41584" path="/var/lib/kubelet/pods/c0583441-b4b9-4511-ab05-9af993b41584/volumes" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.249507 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79be4f00-8769-4d3c-aa9c-a1bb24787668","Type":"ContainerStarted","Data":"6b86215805ebb2550dde144fe013dddf5601e9e177c7618711b0b1066ece9fcb"} Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.279899 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.431077 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5hkhc"] Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.433611 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.438309 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.438523 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.445090 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5hkhc"] Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.556679 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.625834 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-scripts\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.625992 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.626062 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-config-data\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc 
kubenswrapper[4837]: I1014 13:20:59.626793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5kf\" (UniqueName: \"kubernetes.io/projected/400438cc-a9d6-4683-8eeb-df32774fc5a5-kube-api-access-dz5kf\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.641280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-np2rl"] Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.641563 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerName="dnsmasq-dns" containerID="cri-o://8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8" gracePeriod=10 Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.734526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-config-data\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.734583 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5kf\" (UniqueName: \"kubernetes.io/projected/400438cc-a9d6-4683-8eeb-df32774fc5a5-kube-api-access-dz5kf\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.734689 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-scripts\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: 
\"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.734783 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.738718 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-config-data\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.739950 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-scripts\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.740265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.753808 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5kf\" (UniqueName: \"kubernetes.io/projected/400438cc-a9d6-4683-8eeb-df32774fc5a5-kube-api-access-dz5kf\") pod \"nova-cell1-cell-mapping-5hkhc\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " 
pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:20:59 crc kubenswrapper[4837]: I1014 13:20:59.801683 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.102733 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.143779 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4tk6\" (UniqueName: \"kubernetes.io/projected/5dd5cf86-df02-4983-86f2-430028f1d02d-kube-api-access-z4tk6\") pod \"5dd5cf86-df02-4983-86f2-430028f1d02d\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.143843 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-svc\") pod \"5dd5cf86-df02-4983-86f2-430028f1d02d\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.143970 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-nb\") pod \"5dd5cf86-df02-4983-86f2-430028f1d02d\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.144012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-config\") pod \"5dd5cf86-df02-4983-86f2-430028f1d02d\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.144041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-sb\") pod \"5dd5cf86-df02-4983-86f2-430028f1d02d\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.144102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-swift-storage-0\") pod \"5dd5cf86-df02-4983-86f2-430028f1d02d\" (UID: \"5dd5cf86-df02-4983-86f2-430028f1d02d\") " Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.152536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd5cf86-df02-4983-86f2-430028f1d02d-kube-api-access-z4tk6" (OuterVolumeSpecName: "kube-api-access-z4tk6") pod "5dd5cf86-df02-4983-86f2-430028f1d02d" (UID: "5dd5cf86-df02-4983-86f2-430028f1d02d"). InnerVolumeSpecName "kube-api-access-z4tk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.197815 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5dd5cf86-df02-4983-86f2-430028f1d02d" (UID: "5dd5cf86-df02-4983-86f2-430028f1d02d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.214708 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-config" (OuterVolumeSpecName: "config") pod "5dd5cf86-df02-4983-86f2-430028f1d02d" (UID: "5dd5cf86-df02-4983-86f2-430028f1d02d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.219878 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5dd5cf86-df02-4983-86f2-430028f1d02d" (UID: "5dd5cf86-df02-4983-86f2-430028f1d02d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.221103 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5dd5cf86-df02-4983-86f2-430028f1d02d" (UID: "5dd5cf86-df02-4983-86f2-430028f1d02d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.234813 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5dd5cf86-df02-4983-86f2-430028f1d02d" (UID: "5dd5cf86-df02-4983-86f2-430028f1d02d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.246291 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.246336 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.246352 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4tk6\" (UniqueName: \"kubernetes.io/projected/5dd5cf86-df02-4983-86f2-430028f1d02d-kube-api-access-z4tk6\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.246369 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.246382 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.246394 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dd5cf86-df02-4983-86f2-430028f1d02d-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.260411 4837 generic.go:334] "Generic (PLEG): container finished" podID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerID="8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8" exitCode=0 Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.260513 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" event={"ID":"5dd5cf86-df02-4983-86f2-430028f1d02d","Type":"ContainerDied","Data":"8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8"} Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.260539 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" event={"ID":"5dd5cf86-df02-4983-86f2-430028f1d02d","Type":"ContainerDied","Data":"e6023f3545fd4db942d55f8bfd1c1334f69ef458ce5ffc1d57b1af753871404f"} Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.260555 4837 scope.go:117] "RemoveContainer" containerID="8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.261758 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-np2rl" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.264527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79be4f00-8769-4d3c-aa9c-a1bb24787668","Type":"ContainerStarted","Data":"2fdbc3c76c38a0a0762babd92cb7dae873ea5540d127ee3d45957adaa32503a3"} Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.282246 4837 scope.go:117] "RemoveContainer" containerID="fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.306357 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-np2rl"] Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.313477 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-np2rl"] Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.326085 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5hkhc"] Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.326954 4837 scope.go:117] "RemoveContainer" 
containerID="8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8" Oct 14 13:21:00 crc kubenswrapper[4837]: E1014 13:21:00.328018 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8\": container with ID starting with 8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8 not found: ID does not exist" containerID="8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.328078 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8"} err="failed to get container status \"8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8\": rpc error: code = NotFound desc = could not find container \"8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8\": container with ID starting with 8d346070b63d559e0997055b4ec7b06bac1acea39bb251e37882e9e602df66b8 not found: ID does not exist" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.328108 4837 scope.go:117] "RemoveContainer" containerID="fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88" Oct 14 13:21:00 crc kubenswrapper[4837]: E1014 13:21:00.328496 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88\": container with ID starting with fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88 not found: ID does not exist" containerID="fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88" Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.328520 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88"} err="failed to get container status \"fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88\": rpc error: code = NotFound desc = could not find container \"fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88\": container with ID starting with fabddacc257b7e33256b8b7c32faf48b4423e8c8417d94a1a0cfe255e4b18b88 not found: ID does not exist" Oct 14 13:21:00 crc kubenswrapper[4837]: W1014 13:21:00.331634 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod400438cc_a9d6_4683_8eeb_df32774fc5a5.slice/crio-a226612e150ab709e3ff31a6ab447456feab4324cbacfb6eca787823cbb2e870 WatchSource:0}: Error finding container a226612e150ab709e3ff31a6ab447456feab4324cbacfb6eca787823cbb2e870: Status 404 returned error can't find the container with id a226612e150ab709e3ff31a6ab447456feab4324cbacfb6eca787823cbb2e870 Oct 14 13:21:00 crc kubenswrapper[4837]: I1014 13:21:00.795200 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" path="/var/lib/kubelet/pods/5dd5cf86-df02-4983-86f2-430028f1d02d/volumes" Oct 14 13:21:01 crc kubenswrapper[4837]: I1014 13:21:01.275130 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79be4f00-8769-4d3c-aa9c-a1bb24787668","Type":"ContainerStarted","Data":"abb5255cb50a891deed69d985bd6e2af122e136c4971b3f4654a8132c68511b7"} Oct 14 13:21:01 crc kubenswrapper[4837]: I1014 13:21:01.277533 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5hkhc" event={"ID":"400438cc-a9d6-4683-8eeb-df32774fc5a5","Type":"ContainerStarted","Data":"6ca09ff19002cf33eee6fe24b5ee12596b16e0baba3200cea57a856f2e0becc1"} Oct 14 13:21:01 crc kubenswrapper[4837]: I1014 13:21:01.277568 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-5hkhc" event={"ID":"400438cc-a9d6-4683-8eeb-df32774fc5a5","Type":"ContainerStarted","Data":"a226612e150ab709e3ff31a6ab447456feab4324cbacfb6eca787823cbb2e870"} Oct 14 13:21:02 crc kubenswrapper[4837]: I1014 13:21:02.832753 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5hkhc" podStartSLOduration=3.832730368 podStartE2EDuration="3.832730368s" podCreationTimestamp="2025-10-14 13:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:01.297404253 +0000 UTC m=+1199.214404086" watchObservedRunningTime="2025-10-14 13:21:02.832730368 +0000 UTC m=+1200.749730181" Oct 14 13:21:03 crc kubenswrapper[4837]: I1014 13:21:03.321520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79be4f00-8769-4d3c-aa9c-a1bb24787668","Type":"ContainerStarted","Data":"35c6740ede578c8cffca2a250769e751dac9bd789dceddc4d6e0e2d7240bea94"} Oct 14 13:21:03 crc kubenswrapper[4837]: I1014 13:21:03.321778 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:21:03 crc kubenswrapper[4837]: I1014 13:21:03.351334 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.382716767 podStartE2EDuration="6.351297558s" podCreationTimestamp="2025-10-14 13:20:57 +0000 UTC" firstStartedPulling="2025-10-14 13:20:58.193437352 +0000 UTC m=+1196.110437165" lastFinishedPulling="2025-10-14 13:21:02.162018143 +0000 UTC m=+1200.079017956" observedRunningTime="2025-10-14 13:21:03.350124307 +0000 UTC m=+1201.267124140" watchObservedRunningTime="2025-10-14 13:21:03.351297558 +0000 UTC m=+1201.268297371" Oct 14 13:21:05 crc kubenswrapper[4837]: I1014 13:21:05.345055 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="400438cc-a9d6-4683-8eeb-df32774fc5a5" containerID="6ca09ff19002cf33eee6fe24b5ee12596b16e0baba3200cea57a856f2e0becc1" exitCode=0 Oct 14 13:21:05 crc kubenswrapper[4837]: I1014 13:21:05.345119 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5hkhc" event={"ID":"400438cc-a9d6-4683-8eeb-df32774fc5a5","Type":"ContainerDied","Data":"6ca09ff19002cf33eee6fe24b5ee12596b16e0baba3200cea57a856f2e0becc1"} Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.598546 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.598737 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.720200 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.869936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-combined-ca-bundle\") pod \"400438cc-a9d6-4683-8eeb-df32774fc5a5\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.870321 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-config-data\") pod \"400438cc-a9d6-4683-8eeb-df32774fc5a5\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.870483 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-scripts\") pod \"400438cc-a9d6-4683-8eeb-df32774fc5a5\" (UID: 
\"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.870531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz5kf\" (UniqueName: \"kubernetes.io/projected/400438cc-a9d6-4683-8eeb-df32774fc5a5-kube-api-access-dz5kf\") pod \"400438cc-a9d6-4683-8eeb-df32774fc5a5\" (UID: \"400438cc-a9d6-4683-8eeb-df32774fc5a5\") " Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.885581 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400438cc-a9d6-4683-8eeb-df32774fc5a5-kube-api-access-dz5kf" (OuterVolumeSpecName: "kube-api-access-dz5kf") pod "400438cc-a9d6-4683-8eeb-df32774fc5a5" (UID: "400438cc-a9d6-4683-8eeb-df32774fc5a5"). InnerVolumeSpecName "kube-api-access-dz5kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.885526 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-scripts" (OuterVolumeSpecName: "scripts") pod "400438cc-a9d6-4683-8eeb-df32774fc5a5" (UID: "400438cc-a9d6-4683-8eeb-df32774fc5a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.899136 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "400438cc-a9d6-4683-8eeb-df32774fc5a5" (UID: "400438cc-a9d6-4683-8eeb-df32774fc5a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.905256 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-config-data" (OuterVolumeSpecName: "config-data") pod "400438cc-a9d6-4683-8eeb-df32774fc5a5" (UID: "400438cc-a9d6-4683-8eeb-df32774fc5a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.973145 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.973203 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.973216 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz5kf\" (UniqueName: \"kubernetes.io/projected/400438cc-a9d6-4683-8eeb-df32774fc5a5-kube-api-access-dz5kf\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:06 crc kubenswrapper[4837]: I1014 13:21:06.973227 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/400438cc-a9d6-4683-8eeb-df32774fc5a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.366269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5hkhc" event={"ID":"400438cc-a9d6-4683-8eeb-df32774fc5a5","Type":"ContainerDied","Data":"a226612e150ab709e3ff31a6ab447456feab4324cbacfb6eca787823cbb2e870"} Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.366318 4837 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a226612e150ab709e3ff31a6ab447456feab4324cbacfb6eca787823cbb2e870" Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.366353 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5hkhc" Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.555586 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.555791 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-log" containerID="cri-o://30daba50566b7493b19d2cbd96c67111db75a0982b4c280911425493678cf7ac" gracePeriod=30 Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.556187 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-api" containerID="cri-o://cedd08263862b957c94afc78594a626fef41b1663872e6c90cd89a98fb324bde" gracePeriod=30 Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.564205 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": EOF" Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.570038 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.570277 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" containerName="nova-scheduler-scheduler" containerID="cri-o://3005aebb5857896971649463f1bfd5e6d34b881d8dd10c31dd6936c0be8c2bde" gracePeriod=30 Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.578711 4837 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": EOF" Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.598922 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.600282 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-log" containerID="cri-o://91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2" gracePeriod=30 Oct 14 13:21:07 crc kubenswrapper[4837]: I1014 13:21:07.600318 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-metadata" containerID="cri-o://51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e" gracePeriod=30 Oct 14 13:21:08 crc kubenswrapper[4837]: I1014 13:21:08.381809 4837 generic.go:334] "Generic (PLEG): container finished" podID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerID="30daba50566b7493b19d2cbd96c67111db75a0982b4c280911425493678cf7ac" exitCode=143 Oct 14 13:21:08 crc kubenswrapper[4837]: I1014 13:21:08.381882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93541fa5-f5c5-459f-9419-7ff360db2ae6","Type":"ContainerDied","Data":"30daba50566b7493b19d2cbd96c67111db75a0982b4c280911425493678cf7ac"} Oct 14 13:21:08 crc kubenswrapper[4837]: I1014 13:21:08.384093 4837 generic.go:334] "Generic (PLEG): container finished" podID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerID="91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2" exitCode=143 Oct 14 13:21:08 crc kubenswrapper[4837]: I1014 13:21:08.384140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"90a4e22d-dd23-427e-adbe-06cc729b517d","Type":"ContainerDied","Data":"91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2"} Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.394315 4837 generic.go:334] "Generic (PLEG): container finished" podID="bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" containerID="3005aebb5857896971649463f1bfd5e6d34b881d8dd10c31dd6936c0be8c2bde" exitCode=0 Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.394413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a","Type":"ContainerDied","Data":"3005aebb5857896971649463f1bfd5e6d34b881d8dd10c31dd6936c0be8c2bde"} Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.394660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a","Type":"ContainerDied","Data":"4f73919796828e692b212384610081075c1398b8a54aba28020793a05a92e5a2"} Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.394676 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f73919796828e692b212384610081075c1398b8a54aba28020793a05a92e5a2" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.464755 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.621838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-config-data\") pod \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.622082 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-combined-ca-bundle\") pod \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.622139 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6cfl\" (UniqueName: \"kubernetes.io/projected/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-kube-api-access-j6cfl\") pod \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\" (UID: \"bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a\") " Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.636708 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-kube-api-access-j6cfl" (OuterVolumeSpecName: "kube-api-access-j6cfl") pod "bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" (UID: "bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a"). InnerVolumeSpecName "kube-api-access-j6cfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.666548 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" (UID: "bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.667521 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-config-data" (OuterVolumeSpecName: "config-data") pod "bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" (UID: "bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.724114 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.724149 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:09 crc kubenswrapper[4837]: I1014 13:21:09.724173 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6cfl\" (UniqueName: \"kubernetes.io/projected/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a-kube-api-access-j6cfl\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.402617 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.438827 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.446887 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.462872 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:21:10 crc kubenswrapper[4837]: E1014 13:21:10.463426 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" containerName="nova-scheduler-scheduler" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463449 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" containerName="nova-scheduler-scheduler" Oct 14 13:21:10 crc kubenswrapper[4837]: E1014 13:21:10.463483 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400438cc-a9d6-4683-8eeb-df32774fc5a5" containerName="nova-manage" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463493 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="400438cc-a9d6-4683-8eeb-df32774fc5a5" containerName="nova-manage" Oct 14 13:21:10 crc kubenswrapper[4837]: E1014 13:21:10.463508 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerName="init" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463521 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerName="init" Oct 14 13:21:10 crc kubenswrapper[4837]: E1014 13:21:10.463535 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerName="dnsmasq-dns" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463543 4837 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerName="dnsmasq-dns" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463806 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" containerName="nova-scheduler-scheduler" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463835 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd5cf86-df02-4983-86f2-430028f1d02d" containerName="dnsmasq-dns" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.463857 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="400438cc-a9d6-4683-8eeb-df32774fc5a5" containerName="nova-manage" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.464648 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.467379 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.487852 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.538405 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fs88\" (UniqueName: \"kubernetes.io/projected/ebe0b001-1902-4166-a8a3-b3d0c54139f4-kube-api-access-4fs88\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.538602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe0b001-1902-4166-a8a3-b3d0c54139f4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc 
kubenswrapper[4837]: I1014 13:21:10.538653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe0b001-1902-4166-a8a3-b3d0c54139f4-config-data\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.640900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fs88\" (UniqueName: \"kubernetes.io/projected/ebe0b001-1902-4166-a8a3-b3d0c54139f4-kube-api-access-4fs88\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.640963 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe0b001-1902-4166-a8a3-b3d0c54139f4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.640986 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe0b001-1902-4166-a8a3-b3d0c54139f4-config-data\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.652844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe0b001-1902-4166-a8a3-b3d0c54139f4-config-data\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.652908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ebe0b001-1902-4166-a8a3-b3d0c54139f4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.659445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fs88\" (UniqueName: \"kubernetes.io/projected/ebe0b001-1902-4166-a8a3-b3d0c54139f4-kube-api-access-4fs88\") pod \"nova-scheduler-0\" (UID: \"ebe0b001-1902-4166-a8a3-b3d0c54139f4\") " pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.737922 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:36732->10.217.0.193:8775: read: connection reset by peer" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.737922 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:36748->10.217.0.193:8775: read: connection reset by peer" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.782421 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:21:10 crc kubenswrapper[4837]: I1014 13:21:10.810890 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a" path="/var/lib/kubelet/pods/bc4208cf-d8d7-441b-aa2a-49cf6b1cc11a/volumes" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.231994 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.309010 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.357472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jh9p\" (UniqueName: \"kubernetes.io/projected/90a4e22d-dd23-427e-adbe-06cc729b517d-kube-api-access-8jh9p\") pod \"90a4e22d-dd23-427e-adbe-06cc729b517d\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.357596 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-nova-metadata-tls-certs\") pod \"90a4e22d-dd23-427e-adbe-06cc729b517d\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.357646 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-config-data\") pod \"90a4e22d-dd23-427e-adbe-06cc729b517d\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.357681 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-combined-ca-bundle\") pod \"90a4e22d-dd23-427e-adbe-06cc729b517d\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.357740 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4e22d-dd23-427e-adbe-06cc729b517d-logs\") pod \"90a4e22d-dd23-427e-adbe-06cc729b517d\" (UID: \"90a4e22d-dd23-427e-adbe-06cc729b517d\") " Oct 14 13:21:11 
crc kubenswrapper[4837]: I1014 13:21:11.358250 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a4e22d-dd23-427e-adbe-06cc729b517d-logs" (OuterVolumeSpecName: "logs") pod "90a4e22d-dd23-427e-adbe-06cc729b517d" (UID: "90a4e22d-dd23-427e-adbe-06cc729b517d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.363375 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a4e22d-dd23-427e-adbe-06cc729b517d-kube-api-access-8jh9p" (OuterVolumeSpecName: "kube-api-access-8jh9p") pod "90a4e22d-dd23-427e-adbe-06cc729b517d" (UID: "90a4e22d-dd23-427e-adbe-06cc729b517d"). InnerVolumeSpecName "kube-api-access-8jh9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.391055 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-config-data" (OuterVolumeSpecName: "config-data") pod "90a4e22d-dd23-427e-adbe-06cc729b517d" (UID: "90a4e22d-dd23-427e-adbe-06cc729b517d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.391191 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90a4e22d-dd23-427e-adbe-06cc729b517d" (UID: "90a4e22d-dd23-427e-adbe-06cc729b517d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.414010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ebe0b001-1902-4166-a8a3-b3d0c54139f4","Type":"ContainerStarted","Data":"2f2c8d61daf75051abf7076792449545e96ea965f10f33fee3df1f8c31163f75"} Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.416295 4837 generic.go:334] "Generic (PLEG): container finished" podID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerID="51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e" exitCode=0 Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.416321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90a4e22d-dd23-427e-adbe-06cc729b517d","Type":"ContainerDied","Data":"51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e"} Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.416338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90a4e22d-dd23-427e-adbe-06cc729b517d","Type":"ContainerDied","Data":"84441172a1c58f2dcb3519e1a19e058956b101a0e6ebc7487b375c1e90159e24"} Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.416352 4837 scope.go:117] "RemoveContainer" containerID="51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.416482 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.430097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "90a4e22d-dd23-427e-adbe-06cc729b517d" (UID: "90a4e22d-dd23-427e-adbe-06cc729b517d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.438378 4837 scope.go:117] "RemoveContainer" containerID="91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.455610 4837 scope.go:117] "RemoveContainer" containerID="51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e" Oct 14 13:21:11 crc kubenswrapper[4837]: E1014 13:21:11.456098 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e\": container with ID starting with 51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e not found: ID does not exist" containerID="51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.456139 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e"} err="failed to get container status \"51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e\": rpc error: code = NotFound desc = could not find container \"51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e\": container with ID starting with 51501cdbad5989389aa2a554a953b10488d7616852c7e2776923500337d8b19e not found: ID does not exist" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.456179 4837 scope.go:117] "RemoveContainer" containerID="91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2" Oct 14 13:21:11 crc kubenswrapper[4837]: E1014 13:21:11.456647 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2\": container with ID starting with 
91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2 not found: ID does not exist" containerID="91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.456691 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2"} err="failed to get container status \"91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2\": rpc error: code = NotFound desc = could not find container \"91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2\": container with ID starting with 91d4ab697b6df548013080bae46c8c890967252d318aef9772955d2afb1649c2 not found: ID does not exist" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.459870 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4e22d-dd23-427e-adbe-06cc729b517d-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.459903 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jh9p\" (UniqueName: \"kubernetes.io/projected/90a4e22d-dd23-427e-adbe-06cc729b517d-kube-api-access-8jh9p\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.459917 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.459931 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.459942 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90a4e22d-dd23-427e-adbe-06cc729b517d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.760337 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.777906 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.789260 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:21:11 crc kubenswrapper[4837]: E1014 13:21:11.789675 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-metadata" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.789730 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-metadata" Oct 14 13:21:11 crc kubenswrapper[4837]: E1014 13:21:11.789747 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-log" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.789753 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-log" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.789940 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-metadata" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.789961 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" containerName="nova-metadata-log" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.791640 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.796043 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.797325 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.797464 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.866808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.866860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.866993 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-config-data\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.867064 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db56d67-c528-47cc-8569-d9636ebd2667-logs\") pod \"nova-metadata-0\" (UID: 
\"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.867114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz69b\" (UniqueName: \"kubernetes.io/projected/2db56d67-c528-47cc-8569-d9636ebd2667-kube-api-access-gz69b\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.969591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.969748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.969859 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-config-data\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.969914 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db56d67-c528-47cc-8569-d9636ebd2667-logs\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.969964 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz69b\" (UniqueName: \"kubernetes.io/projected/2db56d67-c528-47cc-8569-d9636ebd2667-kube-api-access-gz69b\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.970718 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db56d67-c528-47cc-8569-d9636ebd2667-logs\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.975133 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.975143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.976209 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db56d67-c528-47cc-8569-d9636ebd2667-config-data\") pod \"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:11 crc kubenswrapper[4837]: I1014 13:21:11.988445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz69b\" (UniqueName: \"kubernetes.io/projected/2db56d67-c528-47cc-8569-d9636ebd2667-kube-api-access-gz69b\") pod 
\"nova-metadata-0\" (UID: \"2db56d67-c528-47cc-8569-d9636ebd2667\") " pod="openstack/nova-metadata-0" Oct 14 13:21:12 crc kubenswrapper[4837]: I1014 13:21:12.108647 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:12.429375 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ebe0b001-1902-4166-a8a3-b3d0c54139f4","Type":"ContainerStarted","Data":"0bbcea6641c6b77bddd547b2c37cd307bbb6483c5354c23ef3bc8c48c0c50caa"} Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:12.456140 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.456124645 podStartE2EDuration="2.456124645s" podCreationTimestamp="2025-10-14 13:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:12.448054698 +0000 UTC m=+1210.365054511" watchObservedRunningTime="2025-10-14 13:21:12.456124645 +0000 UTC m=+1210.373124458" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:12.562068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:12.804822 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a4e22d-dd23-427e-adbe-06cc729b517d" path="/var/lib/kubelet/pods/90a4e22d-dd23-427e-adbe-06cc729b517d/volumes" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:13.443299 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2db56d67-c528-47cc-8569-d9636ebd2667","Type":"ContainerStarted","Data":"a5665f0b5f69eb1653548e41f333e60ae4991c66cb6b0330922d2c9f97fb794d"} Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:13.443585 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"2db56d67-c528-47cc-8569-d9636ebd2667","Type":"ContainerStarted","Data":"633830486a4358c36f1bbbef37b743071b8ff8888cbf1976efd22f6de6e19cfc"} Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:13.445667 4837 generic.go:334] "Generic (PLEG): container finished" podID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerID="cedd08263862b957c94afc78594a626fef41b1663872e6c90cd89a98fb324bde" exitCode=0 Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:13.445756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93541fa5-f5c5-459f-9419-7ff360db2ae6","Type":"ContainerDied","Data":"cedd08263862b957c94afc78594a626fef41b1663872e6c90cd89a98fb324bde"} Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.422357 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.457741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2db56d67-c528-47cc-8569-d9636ebd2667","Type":"ContainerStarted","Data":"f3669f1149da0fd52ac088cb20c9c77b171f5901a980a0730029312b856e8452"} Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.474855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93541fa5-f5c5-459f-9419-7ff360db2ae6","Type":"ContainerDied","Data":"2a3b32c365eae006bc31fd65558eff85658cd2731cfd725b115cd755d155e671"} Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.474911 4837 scope.go:117] "RemoveContainer" containerID="cedd08263862b957c94afc78594a626fef41b1663872e6c90cd89a98fb324bde" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.475061 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.491754 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.491732419 podStartE2EDuration="3.491732419s" podCreationTimestamp="2025-10-14 13:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:14.482854431 +0000 UTC m=+1212.399854264" watchObservedRunningTime="2025-10-14 13:21:14.491732419 +0000 UTC m=+1212.408732232" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.508539 4837 scope.go:117] "RemoveContainer" containerID="30daba50566b7493b19d2cbd96c67111db75a0982b4c280911425493678cf7ac" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.516847 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bd8b\" (UniqueName: \"kubernetes.io/projected/93541fa5-f5c5-459f-9419-7ff360db2ae6-kube-api-access-2bd8b\") pod \"93541fa5-f5c5-459f-9419-7ff360db2ae6\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.517011 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-combined-ca-bundle\") pod \"93541fa5-f5c5-459f-9419-7ff360db2ae6\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.517076 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-public-tls-certs\") pod \"93541fa5-f5c5-459f-9419-7ff360db2ae6\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.517144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-config-data\") pod \"93541fa5-f5c5-459f-9419-7ff360db2ae6\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.517212 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93541fa5-f5c5-459f-9419-7ff360db2ae6-logs\") pod \"93541fa5-f5c5-459f-9419-7ff360db2ae6\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.517272 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-internal-tls-certs\") pod \"93541fa5-f5c5-459f-9419-7ff360db2ae6\" (UID: \"93541fa5-f5c5-459f-9419-7ff360db2ae6\") " Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.517941 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93541fa5-f5c5-459f-9419-7ff360db2ae6-logs" (OuterVolumeSpecName: "logs") pod "93541fa5-f5c5-459f-9419-7ff360db2ae6" (UID: "93541fa5-f5c5-459f-9419-7ff360db2ae6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.535610 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93541fa5-f5c5-459f-9419-7ff360db2ae6-kube-api-access-2bd8b" (OuterVolumeSpecName: "kube-api-access-2bd8b") pod "93541fa5-f5c5-459f-9419-7ff360db2ae6" (UID: "93541fa5-f5c5-459f-9419-7ff360db2ae6"). InnerVolumeSpecName "kube-api-access-2bd8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.548677 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93541fa5-f5c5-459f-9419-7ff360db2ae6" (UID: "93541fa5-f5c5-459f-9419-7ff360db2ae6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.555870 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-config-data" (OuterVolumeSpecName: "config-data") pod "93541fa5-f5c5-459f-9419-7ff360db2ae6" (UID: "93541fa5-f5c5-459f-9419-7ff360db2ae6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.571069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "93541fa5-f5c5-459f-9419-7ff360db2ae6" (UID: "93541fa5-f5c5-459f-9419-7ff360db2ae6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.577385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "93541fa5-f5c5-459f-9419-7ff360db2ae6" (UID: "93541fa5-f5c5-459f-9419-7ff360db2ae6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.619825 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.619855 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.619864 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93541fa5-f5c5-459f-9419-7ff360db2ae6-logs\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.619873 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.619882 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bd8b\" (UniqueName: \"kubernetes.io/projected/93541fa5-f5c5-459f-9419-7ff360db2ae6-kube-api-access-2bd8b\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.619890 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93541fa5-f5c5-459f-9419-7ff360db2ae6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.813363 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.826823 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 
13:21:14.836335 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 14 13:21:14 crc kubenswrapper[4837]: E1014 13:21:14.836807 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-log" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.836831 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-log" Oct 14 13:21:14 crc kubenswrapper[4837]: E1014 13:21:14.836876 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-api" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.836885 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-api" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.837111 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-api" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.837142 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" containerName="nova-api-log" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.838355 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.843834 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.844001 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.844100 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.878247 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.927806 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9lfg\" (UniqueName: \"kubernetes.io/projected/f9a329d7-874c-4b64-b23e-10463d345068-kube-api-access-n9lfg\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.927867 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.927890 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9a329d7-874c-4b64-b23e-10463d345068-logs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.927955 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-config-data\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.927970 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:14 crc kubenswrapper[4837]: I1014 13:21:14.928013 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.030264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-config-data\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.030316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.030380 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 
13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.030476 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9lfg\" (UniqueName: \"kubernetes.io/projected/f9a329d7-874c-4b64-b23e-10463d345068-kube-api-access-n9lfg\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.030520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.030539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9a329d7-874c-4b64-b23e-10463d345068-logs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.031208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9a329d7-874c-4b64-b23e-10463d345068-logs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.034725 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.035260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.035314 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-config-data\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.038786 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a329d7-874c-4b64-b23e-10463d345068-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.046572 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9lfg\" (UniqueName: \"kubernetes.io/projected/f9a329d7-874c-4b64-b23e-10463d345068-kube-api-access-n9lfg\") pod \"nova-api-0\" (UID: \"f9a329d7-874c-4b64-b23e-10463d345068\") " pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.164932 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.633323 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:21:15 crc kubenswrapper[4837]: I1014 13:21:15.783400 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 14 13:21:16 crc kubenswrapper[4837]: I1014 13:21:16.498800 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9a329d7-874c-4b64-b23e-10463d345068","Type":"ContainerStarted","Data":"b363f2680983f812f987942526b4fd100486b33f0fe656ef99af12ea6c33511e"} Oct 14 13:21:16 crc kubenswrapper[4837]: I1014 13:21:16.500285 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9a329d7-874c-4b64-b23e-10463d345068","Type":"ContainerStarted","Data":"720879347a40f32604715eb9987ede107ed0758c29d925a915b81e0653c0aadb"} Oct 14 13:21:16 crc kubenswrapper[4837]: I1014 13:21:16.500309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9a329d7-874c-4b64-b23e-10463d345068","Type":"ContainerStarted","Data":"c79faebf87b80ebdb28a43f2c2ae5cdd21125dc34a9ab5e42bf66063e38765ba"} Oct 14 13:21:16 crc kubenswrapper[4837]: I1014 13:21:16.522461 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.522436241 podStartE2EDuration="2.522436241s" podCreationTimestamp="2025-10-14 13:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:16.515461235 +0000 UTC m=+1214.432461068" watchObservedRunningTime="2025-10-14 13:21:16.522436241 +0000 UTC m=+1214.439436054" Oct 14 13:21:16 crc kubenswrapper[4837]: I1014 13:21:16.798585 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93541fa5-f5c5-459f-9419-7ff360db2ae6" 
path="/var/lib/kubelet/pods/93541fa5-f5c5-459f-9419-7ff360db2ae6/volumes" Oct 14 13:21:17 crc kubenswrapper[4837]: I1014 13:21:17.109021 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:21:17 crc kubenswrapper[4837]: I1014 13:21:17.109076 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:21:20 crc kubenswrapper[4837]: I1014 13:21:20.782855 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 14 13:21:20 crc kubenswrapper[4837]: I1014 13:21:20.873521 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 14 13:21:21 crc kubenswrapper[4837]: I1014 13:21:21.604261 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 14 13:21:22 crc kubenswrapper[4837]: I1014 13:21:22.109633 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:21:22 crc kubenswrapper[4837]: I1014 13:21:22.109698 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:21:23 crc kubenswrapper[4837]: I1014 13:21:23.123371 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2db56d67-c528-47cc-8569-d9636ebd2667" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:21:23 crc kubenswrapper[4837]: I1014 13:21:23.123371 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2db56d67-c528-47cc-8569-d9636ebd2667" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" Oct 14 13:21:25 crc kubenswrapper[4837]: I1014 13:21:25.166563 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:21:25 crc kubenswrapper[4837]: I1014 13:21:25.166654 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:21:26 crc kubenswrapper[4837]: I1014 13:21:26.181645 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f9a329d7-874c-4b64-b23e-10463d345068" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:21:26 crc kubenswrapper[4837]: I1014 13:21:26.181656 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f9a329d7-874c-4b64-b23e-10463d345068" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:21:27 crc kubenswrapper[4837]: I1014 13:21:27.750812 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:21:32 crc kubenswrapper[4837]: I1014 13:21:32.114334 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:21:32 crc kubenswrapper[4837]: I1014 13:21:32.116437 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:21:32 crc kubenswrapper[4837]: I1014 13:21:32.120523 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:21:32 crc kubenswrapper[4837]: I1014 13:21:32.120909 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:21:35 crc kubenswrapper[4837]: I1014 13:21:35.174597 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:21:35 crc kubenswrapper[4837]: I1014 13:21:35.175835 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:21:35 crc kubenswrapper[4837]: I1014 13:21:35.176723 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:21:35 crc kubenswrapper[4837]: I1014 13:21:35.184572 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:21:35 crc kubenswrapper[4837]: I1014 13:21:35.694420 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:21:35 crc kubenswrapper[4837]: I1014 13:21:35.704609 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:21:44 crc kubenswrapper[4837]: I1014 13:21:44.422783 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:21:45 crc kubenswrapper[4837]: I1014 13:21:45.424606 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:21:48 crc kubenswrapper[4837]: I1014 13:21:48.363129 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="rabbitmq" containerID="cri-o://987dc060a05aa5aefc79f1046096b30535fadb959f3f2adb2f63161a8f110645" gracePeriod=604797 Oct 14 13:21:49 crc kubenswrapper[4837]: I1014 13:21:49.011585 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 14 13:21:49 crc kubenswrapper[4837]: I1014 13:21:49.292683 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/rabbitmq-cell1-server-0" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerName="rabbitmq" containerID="cri-o://0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe" gracePeriod=604797 Oct 14 13:21:54 crc kubenswrapper[4837]: I1014 13:21:54.874361 4837 generic.go:334] "Generic (PLEG): container finished" podID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerID="987dc060a05aa5aefc79f1046096b30535fadb959f3f2adb2f63161a8f110645" exitCode=0 Oct 14 13:21:54 crc kubenswrapper[4837]: I1014 13:21:54.874448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0fcd80e-9aec-4608-bfc5-653c443d1849","Type":"ContainerDied","Data":"987dc060a05aa5aefc79f1046096b30535fadb959f3f2adb2f63161a8f110645"} Oct 14 13:21:54 crc kubenswrapper[4837]: I1014 13:21:54.959077 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125007 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-config-data\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125096 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s4tq\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-kube-api-access-9s4tq\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125151 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-confd\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: 
\"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0fcd80e-9aec-4608-bfc5-653c443d1849-pod-info\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125307 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-plugins\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125327 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-plugins-conf\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125349 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0fcd80e-9aec-4608-bfc5-653c443d1849-erlang-cookie-secret\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125367 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-server-conf\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-erlang-cookie\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.125443 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-tls\") pod \"e0fcd80e-9aec-4608-bfc5-653c443d1849\" (UID: \"e0fcd80e-9aec-4608-bfc5-653c443d1849\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.126073 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.126196 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.126340 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.131255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-kube-api-access-9s4tq" (OuterVolumeSpecName: "kube-api-access-9s4tq") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "kube-api-access-9s4tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.133673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e0fcd80e-9aec-4608-bfc5-653c443d1849-pod-info" (OuterVolumeSpecName: "pod-info") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.134296 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.134513 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0fcd80e-9aec-4608-bfc5-653c443d1849-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.149072 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.191502 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-config-data" (OuterVolumeSpecName: "config-data") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.211348 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-server-conf" (OuterVolumeSpecName: "server-conf") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227489 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227523 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227535 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0fcd80e-9aec-4608-bfc5-653c443d1849-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227546 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-server-conf\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227576 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227589 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227602 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227613 4837 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0fcd80e-9aec-4608-bfc5-653c443d1849-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227623 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s4tq\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-kube-api-access-9s4tq\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.227634 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0fcd80e-9aec-4608-bfc5-653c443d1849-pod-info\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.254216 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.276756 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e0fcd80e-9aec-4608-bfc5-653c443d1849" (UID: "e0fcd80e-9aec-4608-bfc5-653c443d1849"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.329068 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.329112 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0fcd80e-9aec-4608-bfc5-653c443d1849-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.834002 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.901712 4837 generic.go:334] "Generic (PLEG): container finished" podID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerID="0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe" exitCode=0 Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.901774 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.901840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6877e694-37ca-4cd4-ba01-3101d4f7ade4","Type":"ContainerDied","Data":"0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe"} Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.901921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6877e694-37ca-4cd4-ba01-3101d4f7ade4","Type":"ContainerDied","Data":"22f59a257e6fa9c7ee430af21708087e95418104397f897cabe56d52fa9fcdc7"} Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.901951 4837 scope.go:117] "RemoveContainer" containerID="0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.907556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0fcd80e-9aec-4608-bfc5-653c443d1849","Type":"ContainerDied","Data":"a95f1d8e90564fe92217504f036261379d63cf6d56ceb6f89d2522a4543a4d63"} Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.907616 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.933181 4837 scope.go:117] "RemoveContainer" containerID="46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-tls\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942540 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-config-data\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942591 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-plugins-conf\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942625 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-server-conf\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942662 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnfnk\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-kube-api-access-nnfnk\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") 
" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942688 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6877e694-37ca-4cd4-ba01-3101d4f7ade4-erlang-cookie-secret\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942810 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-confd\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-erlang-cookie\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942871 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6877e694-37ca-4cd4-ba01-3101d4f7ade4-pod-info\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.942917 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-plugins\") pod \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\" (UID: \"6877e694-37ca-4cd4-ba01-3101d4f7ade4\") " Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.945733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.946804 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.948773 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.955351 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-kube-api-access-nnfnk" (OuterVolumeSpecName: "kube-api-access-nnfnk") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "kube-api-access-nnfnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.955483 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6877e694-37ca-4cd4-ba01-3101d4f7ade4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.955572 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6877e694-37ca-4cd4-ba01-3101d4f7ade4-pod-info" (OuterVolumeSpecName: "pod-info") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.957581 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.969709 4837 scope.go:117] "RemoveContainer" containerID="0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe" Oct 14 13:21:55 crc kubenswrapper[4837]: E1014 13:21:55.978503 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe\": container with ID starting with 0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe not found: ID does not exist" containerID="0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.978569 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe"} err="failed to get container status \"0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe\": rpc error: code = NotFound desc = could not find container \"0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe\": container with ID starting with 0cb666df906b6e1a6d1ae6d36309d40910f97f03d0df22850094113cdc7e7cfe not found: ID does not exist" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.978606 4837 scope.go:117] "RemoveContainer" containerID="46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce" Oct 14 13:21:55 crc kubenswrapper[4837]: E1014 13:21:55.980258 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce\": container with ID starting with 46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce not found: ID does not exist" containerID="46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 
13:21:55.980322 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce"} err="failed to get container status \"46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce\": rpc error: code = NotFound desc = could not find container \"46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce\": container with ID starting with 46d2e9cc2eab88bd1622134e0b380eb0024a544fd47a00761c1b2f203cdc05ce not found: ID does not exist" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.980411 4837 scope.go:117] "RemoveContainer" containerID="987dc060a05aa5aefc79f1046096b30535fadb959f3f2adb2f63161a8f110645" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.980377 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:55 crc kubenswrapper[4837]: I1014 13:21:55.987181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-config-data" (OuterVolumeSpecName: "config-data") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.005417 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.014268 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.026829 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-server-conf" (OuterVolumeSpecName: "server-conf") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.028954 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: E1014 13:21:56.029309 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="rabbitmq" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.029324 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="rabbitmq" Oct 14 13:21:56 crc kubenswrapper[4837]: E1014 13:21:56.029338 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerName="rabbitmq" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.029344 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerName="rabbitmq" Oct 14 13:21:56 crc kubenswrapper[4837]: E1014 13:21:56.029359 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerName="setup-container" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.029366 4837 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerName="setup-container" Oct 14 13:21:56 crc kubenswrapper[4837]: E1014 13:21:56.029385 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="setup-container" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.029391 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="setup-container" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.029587 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" containerName="rabbitmq" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.029599 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" containerName="rabbitmq" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.030464 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.032640 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.033918 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.033994 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.034252 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.035381 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.035509 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.035899 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gstcl" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.043232 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059273 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059307 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059319 4837 
reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059330 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6877e694-37ca-4cd4-ba01-3101d4f7ade4-server-conf\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059342 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnfnk\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-kube-api-access-nnfnk\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059379 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059395 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6877e694-37ca-4cd4-ba01-3101d4f7ade4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059406 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059418 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6877e694-37ca-4cd4-ba01-3101d4f7ade4-pod-info\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.059429 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.065138 4837 scope.go:117] "RemoveContainer" containerID="3f5e3ffe8a2b7184a62f038431da9b3670de00f0a32b5022a406da34cfe945a7" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.094450 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.100483 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6877e694-37ca-4cd4-ba01-3101d4f7ade4" (UID: "6877e694-37ca-4cd4-ba01-3101d4f7ade4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161511 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161541 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4kp\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-kube-api-access-br4kp\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161648 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161673 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4e21425-fc2a-487e-bb81-615828fd727f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161711 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4e21425-fc2a-487e-bb81-615828fd727f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161735 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161777 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161866 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161910 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161977 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.161992 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6877e694-37ca-4cd4-ba01-3101d4f7ade4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.240304 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 
13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.246542 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.268511 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270019 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270442 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270472 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br4kp\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-kube-api-access-br4kp\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270567 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c4e21425-fc2a-487e-bb81-615828fd727f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270599 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4e21425-fc2a-487e-bb81-615828fd727f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270658 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 
13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.270772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.271245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.271361 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.271723 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.271779 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.271963 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.272445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c4e21425-fc2a-487e-bb81-615828fd727f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.275417 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c4e21425-fc2a-487e-bb81-615828fd727f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.275969 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.276066 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.278470 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c4e21425-fc2a-487e-bb81-615828fd727f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.278705 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.283251 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.283464 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.283657 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-krn9k" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.285604 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.285685 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.286023 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.286267 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 14 
13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.310396 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4kp\" (UniqueName: \"kubernetes.io/projected/c4e21425-fc2a-487e-bb81-615828fd727f-kube-api-access-br4kp\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.318827 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.325929 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c4e21425-fc2a-487e-bb81-615828fd727f\") " pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.361434 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511139 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511276 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511321 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511478 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511514 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9slbt\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-kube-api-access-9slbt\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.511603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.613883 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.613974 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.613996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9slbt\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-kube-api-access-9slbt\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614063 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614205 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614270 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614286 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.614862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.615203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.615585 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.616334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.616663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.617821 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.619561 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.619935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.624107 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.634237 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.639125 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9slbt\" (UniqueName: \"kubernetes.io/projected/4c7edbbd-c98f-4800-a4ae-49ea0de7f12d-kube-api-access-9slbt\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.660137 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.795969 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6877e694-37ca-4cd4-ba01-3101d4f7ade4" path="/var/lib/kubelet/pods/6877e694-37ca-4cd4-ba01-3101d4f7ade4/volumes" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.796999 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0fcd80e-9aec-4608-bfc5-653c443d1849" path="/var/lib/kubelet/pods/e0fcd80e-9aec-4608-bfc5-653c443d1849/volumes" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.867861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.909523 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:21:56 crc kubenswrapper[4837]: I1014 13:21:56.928618 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c4e21425-fc2a-487e-bb81-615828fd727f","Type":"ContainerStarted","Data":"ea5c67cad138addf54f96b02ec59ccfea9c1010969fb770500e105e9214e9d19"} Oct 14 13:21:57 crc kubenswrapper[4837]: I1014 13:21:57.387408 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 14 13:21:57 crc kubenswrapper[4837]: I1014 13:21:57.943145 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d","Type":"ContainerStarted","Data":"9d2d43588a489cb5292246a6788329b0f1e4c1208f8f8bfccdc9c66396df77c8"} Oct 14 13:21:58 crc kubenswrapper[4837]: I1014 13:21:58.951565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c4e21425-fc2a-487e-bb81-615828fd727f","Type":"ContainerStarted","Data":"2dd05e4b9f9ed093eb1ccb9cf8563a7ce20e044f83600c50b0a660881c122c31"} Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.104654 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-jwrh7"] Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.106139 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.116611 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.121541 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-jwrh7"] Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166357 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-svc\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166416 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166458 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: 
\"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-config\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166527 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166545 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzvg8\" (UniqueName: \"kubernetes.io/projected/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-kube-api-access-vzvg8\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.166647 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268200 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-svc\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-config\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: 
\"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.268474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzvg8\" (UniqueName: \"kubernetes.io/projected/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-kube-api-access-vzvg8\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.269236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-svc\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.269298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.269311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.269505 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-config\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 
14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.269603 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.269786 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.295405 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzvg8\" (UniqueName: \"kubernetes.io/projected/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-kube-api-access-vzvg8\") pod \"dnsmasq-dns-5576978c7c-jwrh7\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.438200 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.868032 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-jwrh7"] Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.962238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" event={"ID":"8b26c524-66c9-4e5c-9f45-983ee5a49e6f","Type":"ContainerStarted","Data":"ef9b4424935fbdd84017e6b5adf90a6b61a249153e15478a51135142e7c256d6"} Oct 14 13:21:59 crc kubenswrapper[4837]: I1014 13:21:59.965293 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d","Type":"ContainerStarted","Data":"6342b67acc3bba3564abf5508081eafb268b7de0faa4ac752ce81f5b8080ad47"} Oct 14 13:22:00 crc kubenswrapper[4837]: I1014 13:22:00.977605 4837 generic.go:334] "Generic (PLEG): container finished" podID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerID="54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897" exitCode=0 Oct 14 13:22:00 crc kubenswrapper[4837]: I1014 13:22:00.977659 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" event={"ID":"8b26c524-66c9-4e5c-9f45-983ee5a49e6f","Type":"ContainerDied","Data":"54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897"} Oct 14 13:22:01 crc kubenswrapper[4837]: I1014 13:22:01.988845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" event={"ID":"8b26c524-66c9-4e5c-9f45-983ee5a49e6f","Type":"ContainerStarted","Data":"70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a"} Oct 14 13:22:01 crc kubenswrapper[4837]: I1014 13:22:01.989211 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:22:02 crc kubenswrapper[4837]: I1014 13:22:02.012150 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" podStartSLOduration=3.012129351 podStartE2EDuration="3.012129351s" podCreationTimestamp="2025-10-14 13:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:02.007024185 +0000 UTC m=+1259.924023998" watchObservedRunningTime="2025-10-14 13:22:02.012129351 +0000 UTC m=+1259.929129154" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.439912 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.510251 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6hcqh"] Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.510785 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="dnsmasq-dns" containerID="cri-o://9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25" gracePeriod=10 Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.556241 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: connect: connection refused" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.685594 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-c696x"] Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.687945 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.693820 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-c696x"] Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769518 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj4jg\" (UniqueName: \"kubernetes.io/projected/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-kube-api-access-fj4jg\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769582 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.769633 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-config\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871610 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fj4jg\" (UniqueName: \"kubernetes.io/projected/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-kube-api-access-fj4jg\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871636 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.871807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-config\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.873023 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.873545 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.874036 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-config\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.874246 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.874919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.875076 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:09 crc kubenswrapper[4837]: I1014 13:22:09.895065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj4jg\" (UniqueName: \"kubernetes.io/projected/4a6f65cd-fd19-4b6a-9dee-4ef117beb86f-kube-api-access-fj4jg\") pod \"dnsmasq-dns-8c6f6df99-c696x\" (UID: \"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f\") " pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.013489 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.016802 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.071830 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerID="9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25" exitCode=0 Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.071877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" event={"ID":"e35cc05c-bae2-4693-88ff-2e28dce40010","Type":"ContainerDied","Data":"9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25"} Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.071905 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" event={"ID":"e35cc05c-bae2-4693-88ff-2e28dce40010","Type":"ContainerDied","Data":"5acbeef248dbee38ff327852d6bbabf2c549f3994f3e52abbc736f8c8a3ed34f"} Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.071921 4837 scope.go:117] "RemoveContainer" containerID="9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 
13:22:10.072040 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6hcqh" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.076020 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-sb\") pod \"e35cc05c-bae2-4693-88ff-2e28dce40010\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.076173 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-svc\") pod \"e35cc05c-bae2-4693-88ff-2e28dce40010\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.076219 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-config\") pod \"e35cc05c-bae2-4693-88ff-2e28dce40010\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.076282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-swift-storage-0\") pod \"e35cc05c-bae2-4693-88ff-2e28dce40010\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.077493 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-nb\") pod \"e35cc05c-bae2-4693-88ff-2e28dce40010\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.077641 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-59pff\" (UniqueName: \"kubernetes.io/projected/e35cc05c-bae2-4693-88ff-2e28dce40010-kube-api-access-59pff\") pod \"e35cc05c-bae2-4693-88ff-2e28dce40010\" (UID: \"e35cc05c-bae2-4693-88ff-2e28dce40010\") " Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.081852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35cc05c-bae2-4693-88ff-2e28dce40010-kube-api-access-59pff" (OuterVolumeSpecName: "kube-api-access-59pff") pod "e35cc05c-bae2-4693-88ff-2e28dce40010" (UID: "e35cc05c-bae2-4693-88ff-2e28dce40010"). InnerVolumeSpecName "kube-api-access-59pff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.142928 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-config" (OuterVolumeSpecName: "config") pod "e35cc05c-bae2-4693-88ff-2e28dce40010" (UID: "e35cc05c-bae2-4693-88ff-2e28dce40010"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.147413 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e35cc05c-bae2-4693-88ff-2e28dce40010" (UID: "e35cc05c-bae2-4693-88ff-2e28dce40010"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.151588 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e35cc05c-bae2-4693-88ff-2e28dce40010" (UID: "e35cc05c-bae2-4693-88ff-2e28dce40010"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.155785 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e35cc05c-bae2-4693-88ff-2e28dce40010" (UID: "e35cc05c-bae2-4693-88ff-2e28dce40010"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.159624 4837 scope.go:117] "RemoveContainer" containerID="2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.164918 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e35cc05c-bae2-4693-88ff-2e28dce40010" (UID: "e35cc05c-bae2-4693-88ff-2e28dce40010"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.179526 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59pff\" (UniqueName: \"kubernetes.io/projected/e35cc05c-bae2-4693-88ff-2e28dce40010-kube-api-access-59pff\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.179556 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.179566 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.179576 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.179584 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.179593 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e35cc05c-bae2-4693-88ff-2e28dce40010-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.181026 4837 scope.go:117] "RemoveContainer" containerID="9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25" Oct 14 13:22:10 crc kubenswrapper[4837]: E1014 13:22:10.184887 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25\": container with ID starting with 9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25 not found: ID does not exist" containerID="9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.184937 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25"} err="failed to get container status \"9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25\": rpc error: code = NotFound desc = could not find container \"9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25\": container with ID starting with 9fbb1e85a63b2d7c5da79e92e7f2a4909703869e54f657ce2d4a35168e9eba25 not found: ID does not exist" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.184968 4837 scope.go:117] "RemoveContainer" containerID="2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3" Oct 14 13:22:10 crc kubenswrapper[4837]: E1014 13:22:10.185415 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3\": container with ID starting with 2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3 not found: ID does not exist" containerID="2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.185470 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3"} err="failed to get container status \"2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3\": rpc error: code = NotFound desc = could not find container 
\"2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3\": container with ID starting with 2da94abd8d29f7672163e69b3ba76e108ccabc8c6afc2a44f66e5eb2b7ccb6e3 not found: ID does not exist" Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.404494 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6hcqh"] Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.415019 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6hcqh"] Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.524338 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-c696x"] Oct 14 13:22:10 crc kubenswrapper[4837]: W1014 13:22:10.528601 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6f65cd_fd19_4b6a_9dee_4ef117beb86f.slice/crio-9fe55e799eb652e3917d567d1f3d6502a7b8c1af3e19de67732af4ca08f8943b WatchSource:0}: Error finding container 9fe55e799eb652e3917d567d1f3d6502a7b8c1af3e19de67732af4ca08f8943b: Status 404 returned error can't find the container with id 9fe55e799eb652e3917d567d1f3d6502a7b8c1af3e19de67732af4ca08f8943b Oct 14 13:22:10 crc kubenswrapper[4837]: I1014 13:22:10.796276 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" path="/var/lib/kubelet/pods/e35cc05c-bae2-4693-88ff-2e28dce40010/volumes" Oct 14 13:22:11 crc kubenswrapper[4837]: I1014 13:22:11.081564 4837 generic.go:334] "Generic (PLEG): container finished" podID="4a6f65cd-fd19-4b6a-9dee-4ef117beb86f" containerID="b35a5c8440c770863415b90c58b4115d0f78183e8b87b3d4bedc1f6702bc990a" exitCode=0 Oct 14 13:22:11 crc kubenswrapper[4837]: I1014 13:22:11.081629 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" 
event={"ID":"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f","Type":"ContainerDied","Data":"b35a5c8440c770863415b90c58b4115d0f78183e8b87b3d4bedc1f6702bc990a"} Oct 14 13:22:11 crc kubenswrapper[4837]: I1014 13:22:11.082053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" event={"ID":"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f","Type":"ContainerStarted","Data":"9fe55e799eb652e3917d567d1f3d6502a7b8c1af3e19de67732af4ca08f8943b"} Oct 14 13:22:11 crc kubenswrapper[4837]: I1014 13:22:11.141273 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:22:11 crc kubenswrapper[4837]: I1014 13:22:11.141339 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:22:12 crc kubenswrapper[4837]: I1014 13:22:12.094672 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" event={"ID":"4a6f65cd-fd19-4b6a-9dee-4ef117beb86f","Type":"ContainerStarted","Data":"16fc95bd90444cd91ccbb042586de1fb50d78967537bf1781d2fa9fb04ae7413"} Oct 14 13:22:12 crc kubenswrapper[4837]: I1014 13:22:12.094891 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:12 crc kubenswrapper[4837]: I1014 13:22:12.118736 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" podStartSLOduration=3.118717758 podStartE2EDuration="3.118717758s" podCreationTimestamp="2025-10-14 13:22:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:12.110041556 +0000 UTC m=+1270.027041379" watchObservedRunningTime="2025-10-14 13:22:12.118717758 +0000 UTC m=+1270.035717571" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.018382 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-c696x" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.106018 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-jwrh7"] Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.106513 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerName="dnsmasq-dns" containerID="cri-o://70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a" gracePeriod=10 Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.628617 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzvg8\" (UniqueName: \"kubernetes.io/projected/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-kube-api-access-vzvg8\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-svc\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682594 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-config\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682634 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-nb\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682714 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-openstack-edpm-ipam\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-swift-storage-0\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.682761 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-sb\") pod \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\" (UID: \"8b26c524-66c9-4e5c-9f45-983ee5a49e6f\") " Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.704394 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-kube-api-access-vzvg8" (OuterVolumeSpecName: "kube-api-access-vzvg8") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "kube-api-access-vzvg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.737114 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-config" (OuterVolumeSpecName: "config") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.737852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.740762 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.754255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.754810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.755909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b26c524-66c9-4e5c-9f45-983ee5a49e6f" (UID: "8b26c524-66c9-4e5c-9f45-983ee5a49e6f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.785585 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.785880 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.785940 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.786003 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzvg8\" (UniqueName: \"kubernetes.io/projected/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-kube-api-access-vzvg8\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.786107 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.786205 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:20 crc kubenswrapper[4837]: I1014 13:22:20.786283 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b26c524-66c9-4e5c-9f45-983ee5a49e6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.198911 
4837 generic.go:334] "Generic (PLEG): container finished" podID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerID="70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a" exitCode=0 Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.198992 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" event={"ID":"8b26c524-66c9-4e5c-9f45-983ee5a49e6f","Type":"ContainerDied","Data":"70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a"} Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.199042 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" event={"ID":"8b26c524-66c9-4e5c-9f45-983ee5a49e6f","Type":"ContainerDied","Data":"ef9b4424935fbdd84017e6b5adf90a6b61a249153e15478a51135142e7c256d6"} Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.199088 4837 scope.go:117] "RemoveContainer" containerID="70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.199359 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-jwrh7" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.244558 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-jwrh7"] Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.258106 4837 scope.go:117] "RemoveContainer" containerID="54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.266542 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-jwrh7"] Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.307792 4837 scope.go:117] "RemoveContainer" containerID="70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a" Oct 14 13:22:21 crc kubenswrapper[4837]: E1014 13:22:21.308987 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a\": container with ID starting with 70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a not found: ID does not exist" containerID="70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.309028 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a"} err="failed to get container status \"70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a\": rpc error: code = NotFound desc = could not find container \"70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a\": container with ID starting with 70f472bd4e50fcc43460abc0949eca055b7821f2e3370a19d6f09d905691a01a not found: ID does not exist" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.309062 4837 scope.go:117] "RemoveContainer" containerID="54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897" Oct 14 
13:22:21 crc kubenswrapper[4837]: E1014 13:22:21.309449 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897\": container with ID starting with 54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897 not found: ID does not exist" containerID="54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897" Oct 14 13:22:21 crc kubenswrapper[4837]: I1014 13:22:21.309523 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897"} err="failed to get container status \"54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897\": rpc error: code = NotFound desc = could not find container \"54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897\": container with ID starting with 54af293be35721db9b4361dd9fd868cfbea8f17c36c0677c226209eca21e2897 not found: ID does not exist" Oct 14 13:22:22 crc kubenswrapper[4837]: I1014 13:22:22.801185 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" path="/var/lib/kubelet/pods/8b26c524-66c9-4e5c-9f45-983ee5a49e6f/volumes" Oct 14 13:22:31 crc kubenswrapper[4837]: I1014 13:22:31.330898 4837 generic.go:334] "Generic (PLEG): container finished" podID="c4e21425-fc2a-487e-bb81-615828fd727f" containerID="2dd05e4b9f9ed093eb1ccb9cf8563a7ce20e044f83600c50b0a660881c122c31" exitCode=0 Oct 14 13:22:31 crc kubenswrapper[4837]: I1014 13:22:31.331064 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c4e21425-fc2a-487e-bb81-615828fd727f","Type":"ContainerDied","Data":"2dd05e4b9f9ed093eb1ccb9cf8563a7ce20e044f83600c50b0a660881c122c31"} Oct 14 13:22:31 crc kubenswrapper[4837]: I1014 13:22:31.335781 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="4c7edbbd-c98f-4800-a4ae-49ea0de7f12d" containerID="6342b67acc3bba3564abf5508081eafb268b7de0faa4ac752ce81f5b8080ad47" exitCode=0 Oct 14 13:22:31 crc kubenswrapper[4837]: I1014 13:22:31.335834 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d","Type":"ContainerDied","Data":"6342b67acc3bba3564abf5508081eafb268b7de0faa4ac752ce81f5b8080ad47"} Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.224470 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv"] Oct 14 13:22:32 crc kubenswrapper[4837]: E1014 13:22:32.225526 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="init" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.225545 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="init" Oct 14 13:22:32 crc kubenswrapper[4837]: E1014 13:22:32.225576 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerName="init" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.225584 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerName="init" Oct 14 13:22:32 crc kubenswrapper[4837]: E1014 13:22:32.225604 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerName="dnsmasq-dns" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.225613 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerName="dnsmasq-dns" Oct 14 13:22:32 crc kubenswrapper[4837]: E1014 13:22:32.225629 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="dnsmasq-dns" Oct 14 13:22:32 crc 
kubenswrapper[4837]: I1014 13:22:32.225636 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="dnsmasq-dns" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.225848 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35cc05c-bae2-4693-88ff-2e28dce40010" containerName="dnsmasq-dns" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.225867 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b26c524-66c9-4e5c-9f45-983ee5a49e6f" containerName="dnsmasq-dns" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.226795 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.229381 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.229554 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.229579 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.234637 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.237014 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv"] Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.319522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.319603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.319627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmr98\" (UniqueName: \"kubernetes.io/projected/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-kube-api-access-cmr98\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.319692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.362793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7edbbd-c98f-4800-a4ae-49ea0de7f12d","Type":"ContainerStarted","Data":"ee2f96831df0b6c82d69d2fac41ec0c3575f72113175f456238348063c61ea49"} Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.363019 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.366655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c4e21425-fc2a-487e-bb81-615828fd727f","Type":"ContainerStarted","Data":"5db10df2f2a5b9ceb549ef02dc1faa05d2654cefdf3303284e50c2ea3949599d"} Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.366864 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.387635 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.387619985 podStartE2EDuration="36.387619985s" podCreationTimestamp="2025-10-14 13:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:32.386529806 +0000 UTC m=+1290.303529639" watchObservedRunningTime="2025-10-14 13:22:32.387619985 +0000 UTC m=+1290.304619798" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.411723 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.411705351 podStartE2EDuration="37.411705351s" podCreationTimestamp="2025-10-14 13:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:32.410580481 +0000 UTC m=+1290.327580304" watchObservedRunningTime="2025-10-14 13:22:32.411705351 +0000 UTC m=+1290.328705164" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.421190 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: 
\"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.421274 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.421294 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmr98\" (UniqueName: \"kubernetes.io/projected/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-kube-api-access-cmr98\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.421364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.426357 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.426589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.430812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.438609 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmr98\" (UniqueName: \"kubernetes.io/projected/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-kube-api-access-cmr98\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:32 crc kubenswrapper[4837]: I1014 13:22:32.553968 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:33 crc kubenswrapper[4837]: W1014 13:22:33.123525 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbd0386_a3fe_4ad1_8b44_0945dd47a255.slice/crio-958f6b3dbca209ce61e47f217f566118f21185816b29957b657c211951afe8d3 WatchSource:0}: Error finding container 958f6b3dbca209ce61e47f217f566118f21185816b29957b657c211951afe8d3: Status 404 returned error can't find the container with id 958f6b3dbca209ce61e47f217f566118f21185816b29957b657c211951afe8d3 Oct 14 13:22:33 crc kubenswrapper[4837]: I1014 13:22:33.124343 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv"] Oct 14 13:22:33 crc kubenswrapper[4837]: I1014 13:22:33.374708 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" event={"ID":"8fbd0386-a3fe-4ad1-8b44-0945dd47a255","Type":"ContainerStarted","Data":"958f6b3dbca209ce61e47f217f566118f21185816b29957b657c211951afe8d3"} Oct 14 13:22:41 crc kubenswrapper[4837]: I1014 13:22:41.140122 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:22:41 crc kubenswrapper[4837]: I1014 13:22:41.140697 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:22:44 crc kubenswrapper[4837]: I1014 13:22:44.487360 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" event={"ID":"8fbd0386-a3fe-4ad1-8b44-0945dd47a255","Type":"ContainerStarted","Data":"2fdcdd0612b7d947c030a46ea68c13a109e09dfe14892c24115985f53824b058"} Oct 14 13:22:44 crc kubenswrapper[4837]: I1014 13:22:44.519687 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" podStartSLOduration=1.79081686 podStartE2EDuration="12.519661755s" podCreationTimestamp="2025-10-14 13:22:32 +0000 UTC" firstStartedPulling="2025-10-14 13:22:33.126401504 +0000 UTC m=+1291.043401317" lastFinishedPulling="2025-10-14 13:22:43.855246399 +0000 UTC m=+1301.772246212" observedRunningTime="2025-10-14 13:22:44.506673837 +0000 UTC m=+1302.423673670" watchObservedRunningTime="2025-10-14 13:22:44.519661755 +0000 UTC m=+1302.436661588" Oct 14 13:22:46 crc kubenswrapper[4837]: I1014 13:22:46.364316 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 14 13:22:46 crc kubenswrapper[4837]: I1014 13:22:46.913313 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 14 13:22:57 crc kubenswrapper[4837]: I1014 13:22:57.664642 4837 generic.go:334] "Generic (PLEG): container finished" podID="8fbd0386-a3fe-4ad1-8b44-0945dd47a255" containerID="2fdcdd0612b7d947c030a46ea68c13a109e09dfe14892c24115985f53824b058" exitCode=0 Oct 14 13:22:57 crc kubenswrapper[4837]: I1014 13:22:57.664757 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" event={"ID":"8fbd0386-a3fe-4ad1-8b44-0945dd47a255","Type":"ContainerDied","Data":"2fdcdd0612b7d947c030a46ea68c13a109e09dfe14892c24115985f53824b058"} Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.178787 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.278523 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmr98\" (UniqueName: \"kubernetes.io/projected/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-kube-api-access-cmr98\") pod \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.278570 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-inventory\") pod \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.278661 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-repo-setup-combined-ca-bundle\") pod \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.278689 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-ssh-key\") pod \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\" (UID: \"8fbd0386-a3fe-4ad1-8b44-0945dd47a255\") " Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.284952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8fbd0386-a3fe-4ad1-8b44-0945dd47a255" (UID: "8fbd0386-a3fe-4ad1-8b44-0945dd47a255"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.285414 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-kube-api-access-cmr98" (OuterVolumeSpecName: "kube-api-access-cmr98") pod "8fbd0386-a3fe-4ad1-8b44-0945dd47a255" (UID: "8fbd0386-a3fe-4ad1-8b44-0945dd47a255"). InnerVolumeSpecName "kube-api-access-cmr98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.314177 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-inventory" (OuterVolumeSpecName: "inventory") pod "8fbd0386-a3fe-4ad1-8b44-0945dd47a255" (UID: "8fbd0386-a3fe-4ad1-8b44-0945dd47a255"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.315538 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8fbd0386-a3fe-4ad1-8b44-0945dd47a255" (UID: "8fbd0386-a3fe-4ad1-8b44-0945dd47a255"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.381573 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmr98\" (UniqueName: \"kubernetes.io/projected/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-kube-api-access-cmr98\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.381626 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.381646 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.381662 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fbd0386-a3fe-4ad1-8b44-0945dd47a255-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.688860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" event={"ID":"8fbd0386-a3fe-4ad1-8b44-0945dd47a255","Type":"ContainerDied","Data":"958f6b3dbca209ce61e47f217f566118f21185816b29957b657c211951afe8d3"} Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.688905 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="958f6b3dbca209ce61e47f217f566118f21185816b29957b657c211951afe8d3" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.688975 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.830202 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq"] Oct 14 13:22:59 crc kubenswrapper[4837]: E1014 13:22:59.830736 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbd0386-a3fe-4ad1-8b44-0945dd47a255" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.830765 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbd0386-a3fe-4ad1-8b44-0945dd47a255" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.831063 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbd0386-a3fe-4ad1-8b44-0945dd47a255" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.831964 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.834462 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.834501 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.834469 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.835045 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.842893 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq"] Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.890546 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd69z\" (UniqueName: \"kubernetes.io/projected/d5e9e4de-2bda-45cb-a580-b89e8dee024e-kube-api-access-rd69z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.891784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.893494 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.995732 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.996330 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd69z\" (UniqueName: \"kubernetes.io/projected/d5e9e4de-2bda-45cb-a580-b89e8dee024e-kube-api-access-rd69z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.996602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:22:59 crc kubenswrapper[4837]: I1014 13:22:59.999882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:23:00 crc kubenswrapper[4837]: I1014 13:23:00.004498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:23:00 crc kubenswrapper[4837]: I1014 13:23:00.015401 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd69z\" (UniqueName: \"kubernetes.io/projected/d5e9e4de-2bda-45cb-a580-b89e8dee024e-kube-api-access-rd69z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ggmtq\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:23:00 crc kubenswrapper[4837]: I1014 13:23:00.157957 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:23:00 crc kubenswrapper[4837]: I1014 13:23:00.781144 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq"] Oct 14 13:23:01 crc kubenswrapper[4837]: I1014 13:23:01.707003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" event={"ID":"d5e9e4de-2bda-45cb-a580-b89e8dee024e","Type":"ContainerStarted","Data":"9592819eeb7ec3a211474807761af8c5d32f54e15f0b792be3c4fd28c24246c3"} Oct 14 13:23:01 crc kubenswrapper[4837]: I1014 13:23:01.707398 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" event={"ID":"d5e9e4de-2bda-45cb-a580-b89e8dee024e","Type":"ContainerStarted","Data":"cb3b9cae383bfa4716a820c00523b56b80021df4cc33663bb578e7631b4f3b16"} Oct 14 13:23:01 crc kubenswrapper[4837]: I1014 13:23:01.726015 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" podStartSLOduration=2.14913309 podStartE2EDuration="2.72597066s" podCreationTimestamp="2025-10-14 13:22:59 +0000 UTC" firstStartedPulling="2025-10-14 13:23:00.78662545 +0000 UTC m=+1318.703625263" lastFinishedPulling="2025-10-14 13:23:01.363463 +0000 UTC m=+1319.280462833" observedRunningTime="2025-10-14 13:23:01.724292215 +0000 UTC m=+1319.641292028" watchObservedRunningTime="2025-10-14 13:23:01.72597066 +0000 UTC m=+1319.642970473" Oct 14 13:23:04 crc kubenswrapper[4837]: I1014 13:23:04.744697 4837 generic.go:334] "Generic (PLEG): container finished" podID="d5e9e4de-2bda-45cb-a580-b89e8dee024e" containerID="9592819eeb7ec3a211474807761af8c5d32f54e15f0b792be3c4fd28c24246c3" exitCode=0 Oct 14 13:23:04 crc kubenswrapper[4837]: I1014 13:23:04.744825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" event={"ID":"d5e9e4de-2bda-45cb-a580-b89e8dee024e","Type":"ContainerDied","Data":"9592819eeb7ec3a211474807761af8c5d32f54e15f0b792be3c4fd28c24246c3"} Oct 14 13:23:04 crc kubenswrapper[4837]: I1014 13:23:04.992359 4837 scope.go:117] "RemoveContainer" containerID="4077b23ba30ca10ac66ee37b13c21428591f6fb4dd2eef7698b93fee56310245" Oct 14 13:23:05 crc kubenswrapper[4837]: I1014 13:23:05.020017 4837 scope.go:117] "RemoveContainer" containerID="91955c4a91227dbea9570a993160a537c157f36589ea153fd738e9fd0d017547" Oct 14 13:23:05 crc kubenswrapper[4837]: I1014 13:23:05.085216 4837 scope.go:117] "RemoveContainer" containerID="4ae2696762c4caa0ad09d2879e82fa26cff9e32993154d92464c425658467580" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.191541 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.192823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-inventory\") pod \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.235342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-inventory" (OuterVolumeSpecName: "inventory") pod "d5e9e4de-2bda-45cb-a580-b89e8dee024e" (UID: "d5e9e4de-2bda-45cb-a580-b89e8dee024e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.294286 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd69z\" (UniqueName: \"kubernetes.io/projected/d5e9e4de-2bda-45cb-a580-b89e8dee024e-kube-api-access-rd69z\") pod \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.294369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-ssh-key\") pod \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\" (UID: \"d5e9e4de-2bda-45cb-a580-b89e8dee024e\") " Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.294651 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.298718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e9e4de-2bda-45cb-a580-b89e8dee024e-kube-api-access-rd69z" (OuterVolumeSpecName: "kube-api-access-rd69z") pod "d5e9e4de-2bda-45cb-a580-b89e8dee024e" (UID: "d5e9e4de-2bda-45cb-a580-b89e8dee024e"). InnerVolumeSpecName "kube-api-access-rd69z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.318552 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5e9e4de-2bda-45cb-a580-b89e8dee024e" (UID: "d5e9e4de-2bda-45cb-a580-b89e8dee024e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.395785 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd69z\" (UniqueName: \"kubernetes.io/projected/d5e9e4de-2bda-45cb-a580-b89e8dee024e-kube-api-access-rd69z\") on node \"crc\" DevicePath \"\"" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.395824 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5e9e4de-2bda-45cb-a580-b89e8dee024e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.763655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" event={"ID":"d5e9e4de-2bda-45cb-a580-b89e8dee024e","Type":"ContainerDied","Data":"cb3b9cae383bfa4716a820c00523b56b80021df4cc33663bb578e7631b4f3b16"} Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.763699 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3b9cae383bfa4716a820c00523b56b80021df4cc33663bb578e7631b4f3b16" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.763732 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ggmtq" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.857235 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b"] Oct 14 13:23:06 crc kubenswrapper[4837]: E1014 13:23:06.857767 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e9e4de-2bda-45cb-a580-b89e8dee024e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.857817 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e9e4de-2bda-45cb-a580-b89e8dee024e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.858305 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e9e4de-2bda-45cb-a580-b89e8dee024e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.859330 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.861311 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.861489 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.862108 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.865637 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.867047 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b"] Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.911895 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.912014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqztm\" (UniqueName: \"kubernetes.io/projected/6217fcbf-8651-4d63-b670-71de72f5feed-kube-api-access-cqztm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 
13:23:06.912055 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:06 crc kubenswrapper[4837]: I1014 13:23:06.912188 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.013415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.013494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.013560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqztm\" (UniqueName: \"kubernetes.io/projected/6217fcbf-8651-4d63-b670-71de72f5feed-kube-api-access-cqztm\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.013602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.018427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.018619 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.020245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.030866 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-cqztm\" (UniqueName: \"kubernetes.io/projected/6217fcbf-8651-4d63-b670-71de72f5feed-kube-api-access-cqztm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.178840 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.737941 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b"] Oct 14 13:23:07 crc kubenswrapper[4837]: W1014 13:23:07.745382 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6217fcbf_8651_4d63_b670_71de72f5feed.slice/crio-2220175928b61e18456b9e5168e9afcc1e5a84b34fdd8cd10cd331b6bfa1e167 WatchSource:0}: Error finding container 2220175928b61e18456b9e5168e9afcc1e5a84b34fdd8cd10cd331b6bfa1e167: Status 404 returned error can't find the container with id 2220175928b61e18456b9e5168e9afcc1e5a84b34fdd8cd10cd331b6bfa1e167 Oct 14 13:23:07 crc kubenswrapper[4837]: I1014 13:23:07.774748 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" event={"ID":"6217fcbf-8651-4d63-b670-71de72f5feed","Type":"ContainerStarted","Data":"2220175928b61e18456b9e5168e9afcc1e5a84b34fdd8cd10cd331b6bfa1e167"} Oct 14 13:23:08 crc kubenswrapper[4837]: I1014 13:23:08.817398 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" podStartSLOduration=2.232995464 podStartE2EDuration="2.817381786s" podCreationTimestamp="2025-10-14 13:23:06 +0000 UTC" firstStartedPulling="2025-10-14 13:23:07.747060217 +0000 UTC m=+1325.664060040" 
lastFinishedPulling="2025-10-14 13:23:08.331446549 +0000 UTC m=+1326.248446362" observedRunningTime="2025-10-14 13:23:08.807839769 +0000 UTC m=+1326.724839592" watchObservedRunningTime="2025-10-14 13:23:08.817381786 +0000 UTC m=+1326.734381599" Oct 14 13:23:08 crc kubenswrapper[4837]: I1014 13:23:08.824389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" event={"ID":"6217fcbf-8651-4d63-b670-71de72f5feed","Type":"ContainerStarted","Data":"7d9814e2eb783ac77a91666316b9e71daf597134270e5cec8b80ff1ca571dc04"} Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.140369 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.140802 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.140897 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.142095 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c044b8ea9bc069679094c7a3872ef16c9931631e466b1c5d874f80fa606522e9"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:23:11 crc 
kubenswrapper[4837]: I1014 13:23:11.142369 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://c044b8ea9bc069679094c7a3872ef16c9931631e466b1c5d874f80fa606522e9" gracePeriod=600 Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.816678 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="c044b8ea9bc069679094c7a3872ef16c9931631e466b1c5d874f80fa606522e9" exitCode=0 Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.816762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"c044b8ea9bc069679094c7a3872ef16c9931631e466b1c5d874f80fa606522e9"} Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.817840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b"} Oct 14 13:23:11 crc kubenswrapper[4837]: I1014 13:23:11.817860 4837 scope.go:117] "RemoveContainer" containerID="8d59e2770d58989e05b8415f7e303806aa7b4a2d0d357770adcc21ef0909284d" Oct 14 13:24:05 crc kubenswrapper[4837]: I1014 13:24:05.206966 4837 scope.go:117] "RemoveContainer" containerID="74dc92c73db3394647396a31b6fa40ed645d1a6841924c27c9136d65e22ce508" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.649810 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9fclm"] Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.658841 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.661017 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fclm"] Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.843732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ggcc\" (UniqueName: \"kubernetes.io/projected/75d69c73-7035-47cb-9f2d-50685e313123-kube-api-access-9ggcc\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.844470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-utilities\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.844494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-catalog-content\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.946196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ggcc\" (UniqueName: \"kubernetes.io/projected/75d69c73-7035-47cb-9f2d-50685e313123-kube-api-access-9ggcc\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.946273 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-utilities\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.946295 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-catalog-content\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.946912 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-catalog-content\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.946931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-utilities\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.970127 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ggcc\" (UniqueName: \"kubernetes.io/projected/75d69c73-7035-47cb-9f2d-50685e313123-kube-api-access-9ggcc\") pod \"community-operators-9fclm\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:57 crc kubenswrapper[4837]: I1014 13:24:57.994366 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:24:58 crc kubenswrapper[4837]: I1014 13:24:58.508665 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fclm"] Oct 14 13:24:59 crc kubenswrapper[4837]: I1014 13:24:59.002558 4837 generic.go:334] "Generic (PLEG): container finished" podID="75d69c73-7035-47cb-9f2d-50685e313123" containerID="33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e" exitCode=0 Oct 14 13:24:59 crc kubenswrapper[4837]: I1014 13:24:59.002677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fclm" event={"ID":"75d69c73-7035-47cb-9f2d-50685e313123","Type":"ContainerDied","Data":"33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e"} Oct 14 13:24:59 crc kubenswrapper[4837]: I1014 13:24:59.002977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fclm" event={"ID":"75d69c73-7035-47cb-9f2d-50685e313123","Type":"ContainerStarted","Data":"d9e3acc0f6b3a2c154e57b4c73a4a260beae7c955aa89f91cd524fb04071c89e"} Oct 14 13:25:01 crc kubenswrapper[4837]: I1014 13:25:01.026278 4837 generic.go:334] "Generic (PLEG): container finished" podID="75d69c73-7035-47cb-9f2d-50685e313123" containerID="ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf" exitCode=0 Oct 14 13:25:01 crc kubenswrapper[4837]: I1014 13:25:01.026378 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fclm" event={"ID":"75d69c73-7035-47cb-9f2d-50685e313123","Type":"ContainerDied","Data":"ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf"} Oct 14 13:25:03 crc kubenswrapper[4837]: I1014 13:25:03.079593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fclm" 
event={"ID":"75d69c73-7035-47cb-9f2d-50685e313123","Type":"ContainerStarted","Data":"05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659"} Oct 14 13:25:03 crc kubenswrapper[4837]: I1014 13:25:03.106349 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fclm" podStartSLOduration=3.304488728 podStartE2EDuration="6.106330092s" podCreationTimestamp="2025-10-14 13:24:57 +0000 UTC" firstStartedPulling="2025-10-14 13:24:59.004653388 +0000 UTC m=+1436.921653201" lastFinishedPulling="2025-10-14 13:25:01.806494722 +0000 UTC m=+1439.723494565" observedRunningTime="2025-10-14 13:25:03.102739086 +0000 UTC m=+1441.019738909" watchObservedRunningTime="2025-10-14 13:25:03.106330092 +0000 UTC m=+1441.023329915" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.168212 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k658r"] Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.173125 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.189074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k45h\" (UniqueName: \"kubernetes.io/projected/d0a17819-39a7-4ed3-9e43-53fe2fe74540-kube-api-access-5k45h\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.189243 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-catalog-content\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.189349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-utilities\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.189632 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k658r"] Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.296252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k45h\" (UniqueName: \"kubernetes.io/projected/d0a17819-39a7-4ed3-9e43-53fe2fe74540-kube-api-access-5k45h\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.296568 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-catalog-content\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.296614 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-utilities\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.297123 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-utilities\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.297405 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-catalog-content\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.321213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k45h\" (UniqueName: \"kubernetes.io/projected/d0a17819-39a7-4ed3-9e43-53fe2fe74540-kube-api-access-5k45h\") pod \"redhat-marketplace-k658r\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.509271 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:05 crc kubenswrapper[4837]: I1014 13:25:05.987683 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k658r"] Oct 14 13:25:06 crc kubenswrapper[4837]: I1014 13:25:06.113372 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k658r" event={"ID":"d0a17819-39a7-4ed3-9e43-53fe2fe74540","Type":"ContainerStarted","Data":"087ba207cfa12a0069a164f146a98a4108f9fbbf9dc2c3d95b8507aa6f3d7b80"} Oct 14 13:25:07 crc kubenswrapper[4837]: I1014 13:25:07.125344 4837 generic.go:334] "Generic (PLEG): container finished" podID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerID="a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d" exitCode=0 Oct 14 13:25:07 crc kubenswrapper[4837]: I1014 13:25:07.125423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k658r" event={"ID":"d0a17819-39a7-4ed3-9e43-53fe2fe74540","Type":"ContainerDied","Data":"a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d"} Oct 14 13:25:07 crc kubenswrapper[4837]: I1014 13:25:07.995097 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:25:07 crc kubenswrapper[4837]: I1014 13:25:07.995468 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:25:08 crc kubenswrapper[4837]: I1014 13:25:08.078718 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:25:08 crc kubenswrapper[4837]: I1014 13:25:08.200092 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:25:09 crc kubenswrapper[4837]: I1014 13:25:09.148316 4837 generic.go:334] 
"Generic (PLEG): container finished" podID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerID="5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed" exitCode=0 Oct 14 13:25:09 crc kubenswrapper[4837]: I1014 13:25:09.148420 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k658r" event={"ID":"d0a17819-39a7-4ed3-9e43-53fe2fe74540","Type":"ContainerDied","Data":"5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed"} Oct 14 13:25:09 crc kubenswrapper[4837]: I1014 13:25:09.729848 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fclm"] Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.162938 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k658r" event={"ID":"d0a17819-39a7-4ed3-9e43-53fe2fe74540","Type":"ContainerStarted","Data":"d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c"} Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.163030 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9fclm" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="registry-server" containerID="cri-o://05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659" gracePeriod=2 Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.189358 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k658r" podStartSLOduration=2.674648384 podStartE2EDuration="5.189339595s" podCreationTimestamp="2025-10-14 13:25:05 +0000 UTC" firstStartedPulling="2025-10-14 13:25:07.13095764 +0000 UTC m=+1445.047957473" lastFinishedPulling="2025-10-14 13:25:09.645648871 +0000 UTC m=+1447.562648684" observedRunningTime="2025-10-14 13:25:10.188291057 +0000 UTC m=+1448.105290870" watchObservedRunningTime="2025-10-14 13:25:10.189339595 +0000 UTC m=+1448.106339408" Oct 14 
13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.614272 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.805862 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-catalog-content\") pod \"75d69c73-7035-47cb-9f2d-50685e313123\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.806286 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ggcc\" (UniqueName: \"kubernetes.io/projected/75d69c73-7035-47cb-9f2d-50685e313123-kube-api-access-9ggcc\") pod \"75d69c73-7035-47cb-9f2d-50685e313123\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.806555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-utilities\") pod \"75d69c73-7035-47cb-9f2d-50685e313123\" (UID: \"75d69c73-7035-47cb-9f2d-50685e313123\") " Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.807646 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-utilities" (OuterVolumeSpecName: "utilities") pod "75d69c73-7035-47cb-9f2d-50685e313123" (UID: "75d69c73-7035-47cb-9f2d-50685e313123"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.814612 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d69c73-7035-47cb-9f2d-50685e313123-kube-api-access-9ggcc" (OuterVolumeSpecName: "kube-api-access-9ggcc") pod "75d69c73-7035-47cb-9f2d-50685e313123" (UID: "75d69c73-7035-47cb-9f2d-50685e313123"). InnerVolumeSpecName "kube-api-access-9ggcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.877673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75d69c73-7035-47cb-9f2d-50685e313123" (UID: "75d69c73-7035-47cb-9f2d-50685e313123"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.909987 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.910251 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ggcc\" (UniqueName: \"kubernetes.io/projected/75d69c73-7035-47cb-9f2d-50685e313123-kube-api-access-9ggcc\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:10 crc kubenswrapper[4837]: I1014 13:25:10.910399 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d69c73-7035-47cb-9f2d-50685e313123-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.140668 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.140762 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.179766 4837 generic.go:334] "Generic (PLEG): container finished" podID="75d69c73-7035-47cb-9f2d-50685e313123" containerID="05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659" exitCode=0 Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.179845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fclm" event={"ID":"75d69c73-7035-47cb-9f2d-50685e313123","Type":"ContainerDied","Data":"05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659"} Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.179902 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fclm" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.179938 4837 scope.go:117] "RemoveContainer" containerID="05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.179919 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fclm" event={"ID":"75d69c73-7035-47cb-9f2d-50685e313123","Type":"ContainerDied","Data":"d9e3acc0f6b3a2c154e57b4c73a4a260beae7c955aa89f91cd524fb04071c89e"} Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.226057 4837 scope.go:117] "RemoveContainer" containerID="ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.233700 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fclm"] Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.243585 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9fclm"] Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.280752 4837 scope.go:117] "RemoveContainer" containerID="33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.320187 4837 scope.go:117] "RemoveContainer" containerID="05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659" Oct 14 13:25:11 crc kubenswrapper[4837]: E1014 13:25:11.320761 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659\": container with ID starting with 05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659 not found: ID does not exist" containerID="05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.320835 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659"} err="failed to get container status \"05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659\": rpc error: code = NotFound desc = could not find container \"05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659\": container with ID starting with 05c24d95ccff318861ac929dd99abebc6c44ee090ee9ac6f0710aabe7739e659 not found: ID does not exist" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.320882 4837 scope.go:117] "RemoveContainer" containerID="ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf" Oct 14 13:25:11 crc kubenswrapper[4837]: E1014 13:25:11.321413 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf\": container with ID starting with ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf not found: ID does not exist" containerID="ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.321469 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf"} err="failed to get container status \"ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf\": rpc error: code = NotFound desc = could not find container \"ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf\": container with ID starting with ac6c1dfd45b18be8b07952cfabbf5c2ddf1a3bd4d88b1d3f85051fac482946bf not found: ID does not exist" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.321506 4837 scope.go:117] "RemoveContainer" containerID="33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e" Oct 14 13:25:11 crc kubenswrapper[4837]: E1014 
13:25:11.321818 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e\": container with ID starting with 33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e not found: ID does not exist" containerID="33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e" Oct 14 13:25:11 crc kubenswrapper[4837]: I1014 13:25:11.321867 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e"} err="failed to get container status \"33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e\": rpc error: code = NotFound desc = could not find container \"33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e\": container with ID starting with 33eb67a14533c4c0c3abb523dca82cfa23623d02be8c00691b239a522254a57e not found: ID does not exist" Oct 14 13:25:12 crc kubenswrapper[4837]: I1014 13:25:12.806020 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d69c73-7035-47cb-9f2d-50685e313123" path="/var/lib/kubelet/pods/75d69c73-7035-47cb-9f2d-50685e313123/volumes" Oct 14 13:25:15 crc kubenswrapper[4837]: I1014 13:25:15.510119 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:15 crc kubenswrapper[4837]: I1014 13:25:15.510659 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:15 crc kubenswrapper[4837]: I1014 13:25:15.593593 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:16 crc kubenswrapper[4837]: I1014 13:25:16.314992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:16 crc kubenswrapper[4837]: I1014 13:25:16.378679 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k658r"] Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.273126 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k658r" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="registry-server" containerID="cri-o://d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c" gracePeriod=2 Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.709899 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.896333 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-catalog-content\") pod \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.896759 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k45h\" (UniqueName: \"kubernetes.io/projected/d0a17819-39a7-4ed3-9e43-53fe2fe74540-kube-api-access-5k45h\") pod \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.896851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-utilities\") pod \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\" (UID: \"d0a17819-39a7-4ed3-9e43-53fe2fe74540\") " Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.897790 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-utilities" (OuterVolumeSpecName: "utilities") pod "d0a17819-39a7-4ed3-9e43-53fe2fe74540" (UID: "d0a17819-39a7-4ed3-9e43-53fe2fe74540"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.904495 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a17819-39a7-4ed3-9e43-53fe2fe74540-kube-api-access-5k45h" (OuterVolumeSpecName: "kube-api-access-5k45h") pod "d0a17819-39a7-4ed3-9e43-53fe2fe74540" (UID: "d0a17819-39a7-4ed3-9e43-53fe2fe74540"). InnerVolumeSpecName "kube-api-access-5k45h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.912029 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0a17819-39a7-4ed3-9e43-53fe2fe74540" (UID: "d0a17819-39a7-4ed3-9e43-53fe2fe74540"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.999699 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.999734 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k45h\" (UniqueName: \"kubernetes.io/projected/d0a17819-39a7-4ed3-9e43-53fe2fe74540-kube-api-access-5k45h\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:18 crc kubenswrapper[4837]: I1014 13:25:18.999745 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a17819-39a7-4ed3-9e43-53fe2fe74540-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.286508 4837 generic.go:334] "Generic (PLEG): container finished" podID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerID="d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c" exitCode=0 Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.286569 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k658r" event={"ID":"d0a17819-39a7-4ed3-9e43-53fe2fe74540","Type":"ContainerDied","Data":"d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c"} Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.286608 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k658r" event={"ID":"d0a17819-39a7-4ed3-9e43-53fe2fe74540","Type":"ContainerDied","Data":"087ba207cfa12a0069a164f146a98a4108f9fbbf9dc2c3d95b8507aa6f3d7b80"} Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.286630 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k658r" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.286639 4837 scope.go:117] "RemoveContainer" containerID="d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.322514 4837 scope.go:117] "RemoveContainer" containerID="5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.329470 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k658r"] Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.340308 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k658r"] Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.353309 4837 scope.go:117] "RemoveContainer" containerID="a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.408806 4837 scope.go:117] "RemoveContainer" containerID="d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c" Oct 14 13:25:19 crc kubenswrapper[4837]: E1014 13:25:19.409259 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c\": container with ID starting with d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c not found: ID does not exist" containerID="d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.409302 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c"} err="failed to get container status \"d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c\": rpc error: code = NotFound desc = could not find container 
\"d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c\": container with ID starting with d7b0c33a014c0b8f1b3ae25d36c8dc95b3ad3ea093e34f8791d56fa2298ec89c not found: ID does not exist" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.409559 4837 scope.go:117] "RemoveContainer" containerID="5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed" Oct 14 13:25:19 crc kubenswrapper[4837]: E1014 13:25:19.409915 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed\": container with ID starting with 5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed not found: ID does not exist" containerID="5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.410048 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed"} err="failed to get container status \"5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed\": rpc error: code = NotFound desc = could not find container \"5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed\": container with ID starting with 5bdede6578a6f4646c9dfe7d5a26bb6fde7644960082a5b7bb740d4d721dabed not found: ID does not exist" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.410173 4837 scope.go:117] "RemoveContainer" containerID="a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d" Oct 14 13:25:19 crc kubenswrapper[4837]: E1014 13:25:19.410560 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d\": container with ID starting with a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d not found: ID does not exist" 
containerID="a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d" Oct 14 13:25:19 crc kubenswrapper[4837]: I1014 13:25:19.410602 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d"} err="failed to get container status \"a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d\": rpc error: code = NotFound desc = could not find container \"a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d\": container with ID starting with a389179d9ff5a12b9506b6e6a5d4f642246657778ebb5c9bbc87edc9a836493d not found: ID does not exist" Oct 14 13:25:20 crc kubenswrapper[4837]: I1014 13:25:20.801421 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" path="/var/lib/kubelet/pods/d0a17819-39a7-4ed3-9e43-53fe2fe74540/volumes" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.476523 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgjp2"] Oct 14 13:25:22 crc kubenswrapper[4837]: E1014 13:25:22.477526 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="registry-server" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.477550 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="registry-server" Oct 14 13:25:22 crc kubenswrapper[4837]: E1014 13:25:22.477599 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="extract-content" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.477609 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="extract-content" Oct 14 13:25:22 crc kubenswrapper[4837]: E1014 13:25:22.477626 4837 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="extract-utilities" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.477636 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="extract-utilities" Oct 14 13:25:22 crc kubenswrapper[4837]: E1014 13:25:22.477667 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="registry-server" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.477677 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="registry-server" Oct 14 13:25:22 crc kubenswrapper[4837]: E1014 13:25:22.477700 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="extract-utilities" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.477710 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="extract-utilities" Oct 14 13:25:22 crc kubenswrapper[4837]: E1014 13:25:22.477729 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="extract-content" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.477739 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="extract-content" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.478039 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a17819-39a7-4ed3-9e43-53fe2fe74540" containerName="registry-server" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.478118 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d69c73-7035-47cb-9f2d-50685e313123" containerName="registry-server" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.480351 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.501427 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgjp2"] Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.572674 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtdz\" (UniqueName: \"kubernetes.io/projected/f183884f-81a9-4494-8462-7291281a9581-kube-api-access-krtdz\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.572870 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-utilities\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.573148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-catalog-content\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.675251 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krtdz\" (UniqueName: \"kubernetes.io/projected/f183884f-81a9-4494-8462-7291281a9581-kube-api-access-krtdz\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.675346 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-utilities\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.675461 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-catalog-content\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.676221 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-utilities\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.676241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-catalog-content\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.703022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtdz\" (UniqueName: \"kubernetes.io/projected/f183884f-81a9-4494-8462-7291281a9581-kube-api-access-krtdz\") pod \"certified-operators-kgjp2\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:22 crc kubenswrapper[4837]: I1014 13:25:22.802511 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:23 crc kubenswrapper[4837]: I1014 13:25:23.402124 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgjp2"] Oct 14 13:25:23 crc kubenswrapper[4837]: W1014 13:25:23.411740 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf183884f_81a9_4494_8462_7291281a9581.slice/crio-e6154f99ae7d94f6e06aab1ad72d5483ee8fa106a0f5e08c6c4d414b0f00c816 WatchSource:0}: Error finding container e6154f99ae7d94f6e06aab1ad72d5483ee8fa106a0f5e08c6c4d414b0f00c816: Status 404 returned error can't find the container with id e6154f99ae7d94f6e06aab1ad72d5483ee8fa106a0f5e08c6c4d414b0f00c816 Oct 14 13:25:24 crc kubenswrapper[4837]: I1014 13:25:24.368600 4837 generic.go:334] "Generic (PLEG): container finished" podID="f183884f-81a9-4494-8462-7291281a9581" containerID="3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523" exitCode=0 Oct 14 13:25:24 crc kubenswrapper[4837]: I1014 13:25:24.368686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerDied","Data":"3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523"} Oct 14 13:25:24 crc kubenswrapper[4837]: I1014 13:25:24.369206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerStarted","Data":"e6154f99ae7d94f6e06aab1ad72d5483ee8fa106a0f5e08c6c4d414b0f00c816"} Oct 14 13:25:25 crc kubenswrapper[4837]: I1014 13:25:25.385409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" 
event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerStarted","Data":"62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90"} Oct 14 13:25:26 crc kubenswrapper[4837]: I1014 13:25:26.407438 4837 generic.go:334] "Generic (PLEG): container finished" podID="f183884f-81a9-4494-8462-7291281a9581" containerID="62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90" exitCode=0 Oct 14 13:25:26 crc kubenswrapper[4837]: I1014 13:25:26.407847 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerDied","Data":"62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90"} Oct 14 13:25:27 crc kubenswrapper[4837]: I1014 13:25:27.418503 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerStarted","Data":"723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716"} Oct 14 13:25:27 crc kubenswrapper[4837]: I1014 13:25:27.446309 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgjp2" podStartSLOduration=2.949660266 podStartE2EDuration="5.446282514s" podCreationTimestamp="2025-10-14 13:25:22 +0000 UTC" firstStartedPulling="2025-10-14 13:25:24.373323848 +0000 UTC m=+1462.290323661" lastFinishedPulling="2025-10-14 13:25:26.869946076 +0000 UTC m=+1464.786945909" observedRunningTime="2025-10-14 13:25:27.436699818 +0000 UTC m=+1465.353699641" watchObservedRunningTime="2025-10-14 13:25:27.446282514 +0000 UTC m=+1465.363282327" Oct 14 13:25:32 crc kubenswrapper[4837]: I1014 13:25:32.804213 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:32 crc kubenswrapper[4837]: I1014 13:25:32.804653 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:32 crc kubenswrapper[4837]: I1014 13:25:32.861977 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:33 crc kubenswrapper[4837]: I1014 13:25:33.516000 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:33 crc kubenswrapper[4837]: I1014 13:25:33.574955 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgjp2"] Oct 14 13:25:35 crc kubenswrapper[4837]: I1014 13:25:35.488422 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kgjp2" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="registry-server" containerID="cri-o://723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716" gracePeriod=2 Oct 14 13:25:35 crc kubenswrapper[4837]: I1014 13:25:35.950938 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.062768 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-utilities\") pod \"f183884f-81a9-4494-8462-7291281a9581\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.062921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-catalog-content\") pod \"f183884f-81a9-4494-8462-7291281a9581\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.063246 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krtdz\" (UniqueName: \"kubernetes.io/projected/f183884f-81a9-4494-8462-7291281a9581-kube-api-access-krtdz\") pod \"f183884f-81a9-4494-8462-7291281a9581\" (UID: \"f183884f-81a9-4494-8462-7291281a9581\") " Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.063585 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-utilities" (OuterVolumeSpecName: "utilities") pod "f183884f-81a9-4494-8462-7291281a9581" (UID: "f183884f-81a9-4494-8462-7291281a9581"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.063818 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.069197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f183884f-81a9-4494-8462-7291281a9581-kube-api-access-krtdz" (OuterVolumeSpecName: "kube-api-access-krtdz") pod "f183884f-81a9-4494-8462-7291281a9581" (UID: "f183884f-81a9-4494-8462-7291281a9581"). InnerVolumeSpecName "kube-api-access-krtdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.108341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f183884f-81a9-4494-8462-7291281a9581" (UID: "f183884f-81a9-4494-8462-7291281a9581"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.165234 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krtdz\" (UniqueName: \"kubernetes.io/projected/f183884f-81a9-4494-8462-7291281a9581-kube-api-access-krtdz\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.165791 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f183884f-81a9-4494-8462-7291281a9581-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.502837 4837 generic.go:334] "Generic (PLEG): container finished" podID="f183884f-81a9-4494-8462-7291281a9581" containerID="723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716" exitCode=0 Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.502897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerDied","Data":"723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716"} Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.502917 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgjp2" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.502948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgjp2" event={"ID":"f183884f-81a9-4494-8462-7291281a9581","Type":"ContainerDied","Data":"e6154f99ae7d94f6e06aab1ad72d5483ee8fa106a0f5e08c6c4d414b0f00c816"} Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.502978 4837 scope.go:117] "RemoveContainer" containerID="723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.534649 4837 scope.go:117] "RemoveContainer" containerID="62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.547903 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgjp2"] Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.555433 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kgjp2"] Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.570889 4837 scope.go:117] "RemoveContainer" containerID="3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.632109 4837 scope.go:117] "RemoveContainer" containerID="723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716" Oct 14 13:25:36 crc kubenswrapper[4837]: E1014 13:25:36.632902 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716\": container with ID starting with 723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716 not found: ID does not exist" containerID="723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.632939 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716"} err="failed to get container status \"723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716\": rpc error: code = NotFound desc = could not find container \"723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716\": container with ID starting with 723d44c35e7d961a40e50dea398cf29d299ae925176874c596f301d8a593c716 not found: ID does not exist" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.632965 4837 scope.go:117] "RemoveContainer" containerID="62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90" Oct 14 13:25:36 crc kubenswrapper[4837]: E1014 13:25:36.633297 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90\": container with ID starting with 62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90 not found: ID does not exist" containerID="62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.633330 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90"} err="failed to get container status \"62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90\": rpc error: code = NotFound desc = could not find container \"62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90\": container with ID starting with 62f0c777a230946b9b2b97d25d62b2d9b05974a3f66916e69148538b79912b90 not found: ID does not exist" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.633351 4837 scope.go:117] "RemoveContainer" containerID="3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523" Oct 14 13:25:36 crc kubenswrapper[4837]: E1014 
13:25:36.633766 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523\": container with ID starting with 3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523 not found: ID does not exist" containerID="3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.633794 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523"} err="failed to get container status \"3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523\": rpc error: code = NotFound desc = could not find container \"3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523\": container with ID starting with 3a0ff61e891f3fbcd1b79a3622c72b25683713dfac59db1db4011ffd7ab4e523 not found: ID does not exist" Oct 14 13:25:36 crc kubenswrapper[4837]: I1014 13:25:36.797179 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f183884f-81a9-4494-8462-7291281a9581" path="/var/lib/kubelet/pods/f183884f-81a9-4494-8462-7291281a9581/volumes" Oct 14 13:25:41 crc kubenswrapper[4837]: I1014 13:25:41.140631 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:25:41 crc kubenswrapper[4837]: I1014 13:25:41.141249 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.139655 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.140137 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.140243 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.140996 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.141053 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" gracePeriod=600 Oct 14 13:26:11 crc kubenswrapper[4837]: E1014 13:26:11.266298 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.869404 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" exitCode=0 Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.869461 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b"} Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.869706 4837 scope.go:117] "RemoveContainer" containerID="c044b8ea9bc069679094c7a3872ef16c9931631e466b1c5d874f80fa606522e9" Oct 14 13:26:11 crc kubenswrapper[4837]: I1014 13:26:11.870617 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:26:11 crc kubenswrapper[4837]: E1014 13:26:11.871008 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:26:15 crc kubenswrapper[4837]: I1014 13:26:15.909922 4837 generic.go:334] "Generic (PLEG): container finished" podID="6217fcbf-8651-4d63-b670-71de72f5feed" containerID="7d9814e2eb783ac77a91666316b9e71daf597134270e5cec8b80ff1ca571dc04" exitCode=0 Oct 14 
13:26:15 crc kubenswrapper[4837]: I1014 13:26:15.909995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" event={"ID":"6217fcbf-8651-4d63-b670-71de72f5feed","Type":"ContainerDied","Data":"7d9814e2eb783ac77a91666316b9e71daf597134270e5cec8b80ff1ca571dc04"} Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.051607 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-smpbv"] Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.064800 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qm4ns"] Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.083137 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rzn49"] Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.098620 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qm4ns"] Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.109233 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rzn49"] Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.119437 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-smpbv"] Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.319331 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.509113 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-ssh-key\") pod \"6217fcbf-8651-4d63-b670-71de72f5feed\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.509926 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqztm\" (UniqueName: \"kubernetes.io/projected/6217fcbf-8651-4d63-b670-71de72f5feed-kube-api-access-cqztm\") pod \"6217fcbf-8651-4d63-b670-71de72f5feed\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.510040 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-inventory\") pod \"6217fcbf-8651-4d63-b670-71de72f5feed\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.510103 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-bootstrap-combined-ca-bundle\") pod \"6217fcbf-8651-4d63-b670-71de72f5feed\" (UID: \"6217fcbf-8651-4d63-b670-71de72f5feed\") " Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.515295 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6217fcbf-8651-4d63-b670-71de72f5feed-kube-api-access-cqztm" (OuterVolumeSpecName: "kube-api-access-cqztm") pod "6217fcbf-8651-4d63-b670-71de72f5feed" (UID: "6217fcbf-8651-4d63-b670-71de72f5feed"). InnerVolumeSpecName "kube-api-access-cqztm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.519126 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6217fcbf-8651-4d63-b670-71de72f5feed" (UID: "6217fcbf-8651-4d63-b670-71de72f5feed"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.545748 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6217fcbf-8651-4d63-b670-71de72f5feed" (UID: "6217fcbf-8651-4d63-b670-71de72f5feed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.561901 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-inventory" (OuterVolumeSpecName: "inventory") pod "6217fcbf-8651-4d63-b670-71de72f5feed" (UID: "6217fcbf-8651-4d63-b670-71de72f5feed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.612856 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.612899 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqztm\" (UniqueName: \"kubernetes.io/projected/6217fcbf-8651-4d63-b670-71de72f5feed-kube-api-access-cqztm\") on node \"crc\" DevicePath \"\"" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.612916 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.612929 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6217fcbf-8651-4d63-b670-71de72f5feed-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.929710 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" event={"ID":"6217fcbf-8651-4d63-b670-71de72f5feed","Type":"ContainerDied","Data":"2220175928b61e18456b9e5168e9afcc1e5a84b34fdd8cd10cd331b6bfa1e167"} Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.930268 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2220175928b61e18456b9e5168e9afcc1e5a84b34fdd8cd10cd331b6bfa1e167" Oct 14 13:26:17 crc kubenswrapper[4837]: I1014 13:26:17.929817 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.015753 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx"] Oct 14 13:26:18 crc kubenswrapper[4837]: E1014 13:26:18.016501 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6217fcbf-8651-4d63-b670-71de72f5feed" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.016597 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6217fcbf-8651-4d63-b670-71de72f5feed" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 14 13:26:18 crc kubenswrapper[4837]: E1014 13:26:18.016687 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="extract-content" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.016751 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="extract-content" Oct 14 13:26:18 crc kubenswrapper[4837]: E1014 13:26:18.016836 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="extract-utilities" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.016901 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="extract-utilities" Oct 14 13:26:18 crc kubenswrapper[4837]: E1014 13:26:18.017005 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="registry-server" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.017082 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="registry-server" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.017438 
4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6217fcbf-8651-4d63-b670-71de72f5feed" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.017532 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f183884f-81a9-4494-8462-7291281a9581" containerName="registry-server" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.018347 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.024387 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.024417 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.024494 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.025500 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.033014 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx"] Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.125941 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 
13:26:18.126243 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfh7b\" (UniqueName: \"kubernetes.io/projected/714cea27-46ab-4d03-b5d1-81b42d99f6f6-kube-api-access-kfh7b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.126600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.228187 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.228248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.228312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfh7b\" (UniqueName: 
\"kubernetes.io/projected/714cea27-46ab-4d03-b5d1-81b42d99f6f6-kube-api-access-kfh7b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.232618 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.234397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.253020 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfh7b\" (UniqueName: \"kubernetes.io/projected/714cea27-46ab-4d03-b5d1-81b42d99f6f6-kube-api-access-kfh7b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-shsmx\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.348727 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.799939 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b9af55-6774-4230-a7c1-ef5b49d1ab29" path="/var/lib/kubelet/pods/28b9af55-6774-4230-a7c1-ef5b49d1ab29/volumes" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.803024 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c60778-c2e2-4d03-bbfa-f68e1e8b8c22" path="/var/lib/kubelet/pods/36c60778-c2e2-4d03-bbfa-f68e1e8b8c22/volumes" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.804619 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47bcfb5e-cbb6-4626-9264-93e2be887f1c" path="/var/lib/kubelet/pods/47bcfb5e-cbb6-4626-9264-93e2be887f1c/volumes" Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.832554 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx"] Oct 14 13:26:18 crc kubenswrapper[4837]: W1014 13:26:18.841809 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod714cea27_46ab_4d03_b5d1_81b42d99f6f6.slice/crio-c0ecf34f2c81d8393fb98291ef1c8a296b7a82e852fe9fcfbb72b30b0252f6b9 WatchSource:0}: Error finding container c0ecf34f2c81d8393fb98291ef1c8a296b7a82e852fe9fcfbb72b30b0252f6b9: Status 404 returned error can't find the container with id c0ecf34f2c81d8393fb98291ef1c8a296b7a82e852fe9fcfbb72b30b0252f6b9 Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.845339 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:26:18 crc kubenswrapper[4837]: I1014 13:26:18.939839 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" 
event={"ID":"714cea27-46ab-4d03-b5d1-81b42d99f6f6","Type":"ContainerStarted","Data":"c0ecf34f2c81d8393fb98291ef1c8a296b7a82e852fe9fcfbb72b30b0252f6b9"} Oct 14 13:26:19 crc kubenswrapper[4837]: I1014 13:26:19.949924 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" event={"ID":"714cea27-46ab-4d03-b5d1-81b42d99f6f6","Type":"ContainerStarted","Data":"b6d876483f34132518261009eda0f0d4f19f358200620db6bbae46947749d0bf"} Oct 14 13:26:19 crc kubenswrapper[4837]: I1014 13:26:19.968600 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" podStartSLOduration=2.503830086 podStartE2EDuration="2.968579978s" podCreationTimestamp="2025-10-14 13:26:17 +0000 UTC" firstStartedPulling="2025-10-14 13:26:18.845036494 +0000 UTC m=+1516.762036307" lastFinishedPulling="2025-10-14 13:26:19.309786386 +0000 UTC m=+1517.226786199" observedRunningTime="2025-10-14 13:26:19.964609172 +0000 UTC m=+1517.881608985" watchObservedRunningTime="2025-10-14 13:26:19.968579978 +0000 UTC m=+1517.885579781" Oct 14 13:26:24 crc kubenswrapper[4837]: I1014 13:26:24.786510 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:26:24 crc kubenswrapper[4837]: E1014 13:26:24.787259 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:26:26 crc kubenswrapper[4837]: I1014 13:26:26.042862 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c6dd-account-create-xw7kc"] Oct 14 
13:26:26 crc kubenswrapper[4837]: I1014 13:26:26.051083 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c6dd-account-create-xw7kc"] Oct 14 13:26:26 crc kubenswrapper[4837]: I1014 13:26:26.798142 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e" path="/var/lib/kubelet/pods/4d1fa01e-58a0-4877-b0c8-77e6f6de1f6e/volumes" Oct 14 13:26:35 crc kubenswrapper[4837]: I1014 13:26:35.785478 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:26:35 crc kubenswrapper[4837]: E1014 13:26:35.786250 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.029768 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d256-account-create-q8d4t"] Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.039898 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qkhg2"] Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.050329 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-96ae-account-create-qxl8c"] Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.057864 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d256-account-create-q8d4t"] Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.064836 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-96ae-account-create-qxl8c"] Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.071945 4837 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qkhg2"] Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.801345 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33eb2486-60b1-4fc5-aeeb-1a1855693079" path="/var/lib/kubelet/pods/33eb2486-60b1-4fc5-aeeb-1a1855693079/volumes" Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.802681 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a07282f-11f0-40b7-af8f-32fd266b70de" path="/var/lib/kubelet/pods/9a07282f-11f0-40b7-af8f-32fd266b70de/volumes" Oct 14 13:26:44 crc kubenswrapper[4837]: I1014 13:26:44.803868 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fd4c34-4f7d-493b-b792-d3e20b82d5cc" path="/var/lib/kubelet/pods/c5fd4c34-4f7d-493b-b792-d3e20b82d5cc/volumes" Oct 14 13:26:45 crc kubenswrapper[4837]: I1014 13:26:45.040651 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7b2p6"] Oct 14 13:26:45 crc kubenswrapper[4837]: I1014 13:26:45.054400 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5n9j7"] Oct 14 13:26:45 crc kubenswrapper[4837]: I1014 13:26:45.065503 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5n9j7"] Oct 14 13:26:45 crc kubenswrapper[4837]: I1014 13:26:45.074715 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7b2p6"] Oct 14 13:26:46 crc kubenswrapper[4837]: I1014 13:26:46.816949 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="266f6dd3-65a1-49e5-a904-66ed929e8718" path="/var/lib/kubelet/pods/266f6dd3-65a1-49e5-a904-66ed929e8718/volumes" Oct 14 13:26:46 crc kubenswrapper[4837]: I1014 13:26:46.818037 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c" path="/var/lib/kubelet/pods/ff0b9396-170a-4cdd-b5d9-dbddfdb17f2c/volumes" Oct 14 13:26:47 crc 
kubenswrapper[4837]: I1014 13:26:47.784704 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:26:47 crc kubenswrapper[4837]: E1014 13:26:47.784967 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:26:53 crc kubenswrapper[4837]: I1014 13:26:53.032064 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-f7lcc"] Oct 14 13:26:53 crc kubenswrapper[4837]: I1014 13:26:53.044333 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-f7lcc"] Oct 14 13:26:54 crc kubenswrapper[4837]: I1014 13:26:54.800595 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ad4da0-18fa-4eba-990c-4c9c80d4ecdc" path="/var/lib/kubelet/pods/23ad4da0-18fa-4eba-990c-4c9c80d4ecdc/volumes" Oct 14 13:26:55 crc kubenswrapper[4837]: I1014 13:26:55.060359 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-eb27-account-create-mn6ht"] Oct 14 13:26:55 crc kubenswrapper[4837]: I1014 13:26:55.075036 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-398e-account-create-2f6jz"] Oct 14 13:26:55 crc kubenswrapper[4837]: I1014 13:26:55.083858 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9d23-account-create-kjvbn"] Oct 14 13:26:55 crc kubenswrapper[4837]: I1014 13:26:55.093119 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-eb27-account-create-mn6ht"] Oct 14 13:26:55 crc kubenswrapper[4837]: I1014 13:26:55.102428 4837 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-398e-account-create-2f6jz"] Oct 14 13:26:55 crc kubenswrapper[4837]: I1014 13:26:55.111357 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9d23-account-create-kjvbn"] Oct 14 13:26:56 crc kubenswrapper[4837]: I1014 13:26:56.803693 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a9e520-b0fd-4651-9606-cbd533bea731" path="/var/lib/kubelet/pods/55a9e520-b0fd-4651-9606-cbd533bea731/volumes" Oct 14 13:26:56 crc kubenswrapper[4837]: I1014 13:26:56.805240 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ae92d9-bc5d-46a1-b9fb-68b3482dca91" path="/var/lib/kubelet/pods/88ae92d9-bc5d-46a1-b9fb-68b3482dca91/volumes" Oct 14 13:26:56 crc kubenswrapper[4837]: I1014 13:26:56.806456 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46f2d2f-de1a-48e3-b6dd-82e2780ac592" path="/var/lib/kubelet/pods/e46f2d2f-de1a-48e3-b6dd-82e2780ac592/volumes" Oct 14 13:27:00 crc kubenswrapper[4837]: I1014 13:27:00.784923 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:27:00 crc kubenswrapper[4837]: E1014 13:27:00.785760 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:27:04 crc kubenswrapper[4837]: I1014 13:27:04.030323 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wlzpt"] Oct 14 13:27:04 crc kubenswrapper[4837]: I1014 13:27:04.037321 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wlzpt"] Oct 14 13:27:04 crc 
kubenswrapper[4837]: I1014 13:27:04.798107 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b72d2be-c361-496e-9ea4-990b2e8f15a3" path="/var/lib/kubelet/pods/6b72d2be-c361-496e-9ea4-990b2e8f15a3/volumes" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.373308 4837 scope.go:117] "RemoveContainer" containerID="710c9649811fecf9d5f228584d4381aa08581192d962521ec584b60547009d3a" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.417346 4837 scope.go:117] "RemoveContainer" containerID="3005aebb5857896971649463f1bfd5e6d34b881d8dd10c31dd6936c0be8c2bde" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.444867 4837 scope.go:117] "RemoveContainer" containerID="733b1ee8462fb43cd7252bc85846af597025216a00c167e7ed28ee04e0ec454e" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.489947 4837 scope.go:117] "RemoveContainer" containerID="d4226ecec4a49d4a1f3189b44aa2b30eb4b02612f08b154c694d8fd65a28fb89" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.505624 4837 scope.go:117] "RemoveContainer" containerID="b9aa9ab4b24b292f43bb72b34a88b943c063f3802bfe41f80fec488f0da9134f" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.532611 4837 scope.go:117] "RemoveContainer" containerID="e8baea8a872c4f79e0ba7f7f73b864ec9d181a63b44ae45c6769b1ba70748b38" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.615693 4837 scope.go:117] "RemoveContainer" containerID="e4a55f3394f1cb547aa6a393bc2b03463e445b7d3398a6c16bad0f18eb331be1" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.661294 4837 scope.go:117] "RemoveContainer" containerID="999aab289e810a4bf2e8ae2d0732217e105c3af8a373dd907b66adea15b451f1" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.720005 4837 scope.go:117] "RemoveContainer" containerID="25aa3738b380d549592e12ec80b99ff42795198087c88184161d295bfe1a7e76" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.738601 4837 scope.go:117] "RemoveContainer" 
containerID="ab70080b6331a8ef61c921a1cd4a01149fda6956950a609e5b8d0f75db9cbd56" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.787336 4837 scope.go:117] "RemoveContainer" containerID="76b2d7a45c476911b83b1f90b813862ff9bf439f3b2c9fcf10f96f676a19071f" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.837150 4837 scope.go:117] "RemoveContainer" containerID="32f2264f100ce8e8620a783ddc1527cd23f2cb4c2f6ba2f0f5880c121cc16202" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.854464 4837 scope.go:117] "RemoveContainer" containerID="3d653ebce0d14223e36104910a9de58468ea7f218d46625b8420c021c1e3eb4d" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.902067 4837 scope.go:117] "RemoveContainer" containerID="d4541d32894d96ad481241adf9f042195e567a7506ae4ce4bcdb491726b0401a" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.927023 4837 scope.go:117] "RemoveContainer" containerID="6c5948fe3c127d0fb3be578025f4b48e00bf4e592a793ea27559812b57de7b38" Oct 14 13:27:05 crc kubenswrapper[4837]: I1014 13:27:05.945309 4837 scope.go:117] "RemoveContainer" containerID="70f41c47eed9c40e9856be2650fb7735fd88f63be1e1c9c94fef3d25b2e44ab1" Oct 14 13:27:06 crc kubenswrapper[4837]: I1014 13:27:06.010464 4837 scope.go:117] "RemoveContainer" containerID="38ba2d4649e5a96fe11d8a1cf49a7469c6fb3ed7e29fa35d236852eacac935c0" Oct 14 13:27:13 crc kubenswrapper[4837]: I1014 13:27:13.785312 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:27:13 crc kubenswrapper[4837]: E1014 13:27:13.786014 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:27:25 crc kubenswrapper[4837]: I1014 13:27:25.784434 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:27:25 crc kubenswrapper[4837]: E1014 13:27:25.785424 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:27:38 crc kubenswrapper[4837]: I1014 13:27:38.785391 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:27:38 crc kubenswrapper[4837]: E1014 13:27:38.786248 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:27:49 crc kubenswrapper[4837]: I1014 13:27:49.784829 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:27:49 crc kubenswrapper[4837]: E1014 13:27:49.786296 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:28:03 crc kubenswrapper[4837]: I1014 13:28:03.784494 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:28:03 crc kubenswrapper[4837]: E1014 13:28:03.785329 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:28:04 crc kubenswrapper[4837]: I1014 13:28:04.951817 4837 generic.go:334] "Generic (PLEG): container finished" podID="714cea27-46ab-4d03-b5d1-81b42d99f6f6" containerID="b6d876483f34132518261009eda0f0d4f19f358200620db6bbae46947749d0bf" exitCode=0 Oct 14 13:28:04 crc kubenswrapper[4837]: I1014 13:28:04.951898 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" event={"ID":"714cea27-46ab-4d03-b5d1-81b42d99f6f6","Type":"ContainerDied","Data":"b6d876483f34132518261009eda0f0d4f19f358200620db6bbae46947749d0bf"} Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.404416 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.466130 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-inventory\") pod \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.466307 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-ssh-key\") pod \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.466421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfh7b\" (UniqueName: \"kubernetes.io/projected/714cea27-46ab-4d03-b5d1-81b42d99f6f6-kube-api-access-kfh7b\") pod \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\" (UID: \"714cea27-46ab-4d03-b5d1-81b42d99f6f6\") " Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.474576 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714cea27-46ab-4d03-b5d1-81b42d99f6f6-kube-api-access-kfh7b" (OuterVolumeSpecName: "kube-api-access-kfh7b") pod "714cea27-46ab-4d03-b5d1-81b42d99f6f6" (UID: "714cea27-46ab-4d03-b5d1-81b42d99f6f6"). InnerVolumeSpecName "kube-api-access-kfh7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.510811 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-inventory" (OuterVolumeSpecName: "inventory") pod "714cea27-46ab-4d03-b5d1-81b42d99f6f6" (UID: "714cea27-46ab-4d03-b5d1-81b42d99f6f6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.526102 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "714cea27-46ab-4d03-b5d1-81b42d99f6f6" (UID: "714cea27-46ab-4d03-b5d1-81b42d99f6f6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.568666 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.568736 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/714cea27-46ab-4d03-b5d1-81b42d99f6f6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.568747 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfh7b\" (UniqueName: \"kubernetes.io/projected/714cea27-46ab-4d03-b5d1-81b42d99f6f6-kube-api-access-kfh7b\") on node \"crc\" DevicePath \"\"" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.974502 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" event={"ID":"714cea27-46ab-4d03-b5d1-81b42d99f6f6","Type":"ContainerDied","Data":"c0ecf34f2c81d8393fb98291ef1c8a296b7a82e852fe9fcfbb72b30b0252f6b9"} Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.974561 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ecf34f2c81d8393fb98291ef1c8a296b7a82e852fe9fcfbb72b30b0252f6b9" Oct 14 13:28:06 crc kubenswrapper[4837]: I1014 13:28:06.974633 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-shsmx" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.065081 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp"] Oct 14 13:28:07 crc kubenswrapper[4837]: E1014 13:28:07.065752 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714cea27-46ab-4d03-b5d1-81b42d99f6f6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.067563 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="714cea27-46ab-4d03-b5d1-81b42d99f6f6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.067907 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="714cea27-46ab-4d03-b5d1-81b42d99f6f6" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.068799 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.071419 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.071568 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.071646 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.072754 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.075309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp"] Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.075418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.075467 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.075546 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84tf\" (UniqueName: \"kubernetes.io/projected/b523b3d5-ba31-4620-8287-055d6bc931cc-kube-api-access-r84tf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.178014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.178102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.178326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r84tf\" (UniqueName: \"kubernetes.io/projected/b523b3d5-ba31-4620-8287-055d6bc931cc-kube-api-access-r84tf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.184113 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.187948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.198248 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r84tf\" (UniqueName: \"kubernetes.io/projected/b523b3d5-ba31-4620-8287-055d6bc931cc-kube-api-access-r84tf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.389830 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.916069 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp"] Oct 14 13:28:07 crc kubenswrapper[4837]: I1014 13:28:07.983316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" event={"ID":"b523b3d5-ba31-4620-8287-055d6bc931cc","Type":"ContainerStarted","Data":"6dd943472e25c15570069cbfced1e6b2e36fa670a3b317e3b36f37da18dc4118"} Oct 14 13:28:08 crc kubenswrapper[4837]: I1014 13:28:08.996536 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" event={"ID":"b523b3d5-ba31-4620-8287-055d6bc931cc","Type":"ContainerStarted","Data":"bed68454b9f693d725c8e8dfce016c5c66749fb45daa775f7124ab330144b25b"} Oct 14 13:28:09 crc kubenswrapper[4837]: I1014 13:28:09.019829 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" podStartSLOduration=1.585123148 podStartE2EDuration="2.019813138s" podCreationTimestamp="2025-10-14 13:28:07 +0000 UTC" firstStartedPulling="2025-10-14 13:28:07.925200715 +0000 UTC m=+1625.842200568" lastFinishedPulling="2025-10-14 13:28:08.359890745 +0000 UTC m=+1626.276890558" observedRunningTime="2025-10-14 13:28:09.01542354 +0000 UTC m=+1626.932423443" watchObservedRunningTime="2025-10-14 13:28:09.019813138 +0000 UTC m=+1626.936812941" Oct 14 13:28:14 crc kubenswrapper[4837]: I1014 13:28:14.785418 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:28:14 crc kubenswrapper[4837]: E1014 13:28:14.786638 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:28:26 crc kubenswrapper[4837]: I1014 13:28:26.784784 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:28:26 crc kubenswrapper[4837]: E1014 13:28:26.785500 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:28:28 crc kubenswrapper[4837]: I1014 13:28:28.040948 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7h69"] Oct 14 13:28:28 crc kubenswrapper[4837]: I1014 13:28:28.048282 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7h69"] Oct 14 13:28:28 crc kubenswrapper[4837]: I1014 13:28:28.807520 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2646558f-772d-41e3-8079-ae80e140a23a" path="/var/lib/kubelet/pods/2646558f-772d-41e3-8079-ae80e140a23a/volumes" Oct 14 13:28:39 crc kubenswrapper[4837]: I1014 13:28:39.784877 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:28:39 crc kubenswrapper[4837]: E1014 13:28:39.785835 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:28:44 crc kubenswrapper[4837]: I1014 13:28:44.046287 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jddnh"] Oct 14 13:28:44 crc kubenswrapper[4837]: I1014 13:28:44.055842 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jddnh"] Oct 14 13:28:44 crc kubenswrapper[4837]: I1014 13:28:44.802816 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd" path="/var/lib/kubelet/pods/d417ecbc-b3c6-4550-9d56-f1f13d2ef3bd/volumes" Oct 14 13:28:47 crc kubenswrapper[4837]: I1014 13:28:47.024122 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-b69bq"] Oct 14 13:28:47 crc kubenswrapper[4837]: I1014 13:28:47.033990 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-b69bq"] Oct 14 13:28:48 crc kubenswrapper[4837]: I1014 13:28:48.025932 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-5j7sn"] Oct 14 13:28:48 crc kubenswrapper[4837]: I1014 13:28:48.035451 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-5j7sn"] Oct 14 13:28:48 crc kubenswrapper[4837]: I1014 13:28:48.798624 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d2b3ef-eed6-48cb-948b-3618d6f53fff" path="/var/lib/kubelet/pods/14d2b3ef-eed6-48cb-948b-3618d6f53fff/volumes" Oct 14 13:28:48 crc kubenswrapper[4837]: I1014 13:28:48.800394 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d33f0a5-b130-4614-9636-fa0d61fa4e11" path="/var/lib/kubelet/pods/6d33f0a5-b130-4614-9636-fa0d61fa4e11/volumes" Oct 14 13:28:52 crc kubenswrapper[4837]: I1014 
13:28:52.797443 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:28:52 crc kubenswrapper[4837]: E1014 13:28:52.798254 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:28:58 crc kubenswrapper[4837]: I1014 13:28:58.053478 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mwvdq"] Oct 14 13:28:58 crc kubenswrapper[4837]: I1014 13:28:58.073870 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mwvdq"] Oct 14 13:28:58 crc kubenswrapper[4837]: I1014 13:28:58.797156 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc6adfa-9f60-4e67-ba33-98badd63dd5f" path="/var/lib/kubelet/pods/3dc6adfa-9f60-4e67-ba33-98badd63dd5f/volumes" Oct 14 13:29:06 crc kubenswrapper[4837]: I1014 13:29:06.330647 4837 scope.go:117] "RemoveContainer" containerID="4d186e9c62731590c1feb726579528333c986b72bce2aa320d81395f9be8ff4c" Oct 14 13:29:06 crc kubenswrapper[4837]: I1014 13:29:06.364437 4837 scope.go:117] "RemoveContainer" containerID="c63042a8cb32ba40653be97d76b7967d87b91a763bd979e762f1d82d75694dd3" Oct 14 13:29:06 crc kubenswrapper[4837]: I1014 13:29:06.464961 4837 scope.go:117] "RemoveContainer" containerID="38e3d1774dbf6264443841868486acadd41cbf5d1855735693fcfd529af7260e" Oct 14 13:29:06 crc kubenswrapper[4837]: I1014 13:29:06.495917 4837 scope.go:117] "RemoveContainer" containerID="755881917e433eb82b4d7761844beaad423e0a39296429067ee6da8049a61511" Oct 14 13:29:06 crc kubenswrapper[4837]: I1014 13:29:06.554658 4837 scope.go:117] 
"RemoveContainer" containerID="7df6eeafaba5dada356f67f2737d60da0144aeedf8d39cde437bac7240312815" Oct 14 13:29:06 crc kubenswrapper[4837]: I1014 13:29:06.788677 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:29:06 crc kubenswrapper[4837]: E1014 13:29:06.789322 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:29:19 crc kubenswrapper[4837]: I1014 13:29:19.784901 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:29:19 crc kubenswrapper[4837]: E1014 13:29:19.786013 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:29:20 crc kubenswrapper[4837]: I1014 13:29:20.733921 4837 generic.go:334] "Generic (PLEG): container finished" podID="b523b3d5-ba31-4620-8287-055d6bc931cc" containerID="bed68454b9f693d725c8e8dfce016c5c66749fb45daa775f7124ab330144b25b" exitCode=0 Oct 14 13:29:20 crc kubenswrapper[4837]: I1014 13:29:20.733964 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" 
event={"ID":"b523b3d5-ba31-4620-8287-055d6bc931cc","Type":"ContainerDied","Data":"bed68454b9f693d725c8e8dfce016c5c66749fb45daa775f7124ab330144b25b"} Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.051566 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t886p"] Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.054062 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.061621 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t886p"] Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.150903 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-catalog-content\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.151653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-utilities\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.151687 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfdtb\" (UniqueName: \"kubernetes.io/projected/0fe08e69-e0ea-451b-a61f-7b62fdb90245-kube-api-access-rfdtb\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.161224 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.252799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-ssh-key\") pod \"b523b3d5-ba31-4620-8287-055d6bc931cc\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.252924 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-inventory\") pod \"b523b3d5-ba31-4620-8287-055d6bc931cc\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.252982 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r84tf\" (UniqueName: \"kubernetes.io/projected/b523b3d5-ba31-4620-8287-055d6bc931cc-kube-api-access-r84tf\") pod \"b523b3d5-ba31-4620-8287-055d6bc931cc\" (UID: \"b523b3d5-ba31-4620-8287-055d6bc931cc\") " Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.253364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-catalog-content\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.253495 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-utilities\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 
13:29:22.253526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfdtb\" (UniqueName: \"kubernetes.io/projected/0fe08e69-e0ea-451b-a61f-7b62fdb90245-kube-api-access-rfdtb\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.254112 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-catalog-content\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.254229 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-utilities\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.266753 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b523b3d5-ba31-4620-8287-055d6bc931cc-kube-api-access-r84tf" (OuterVolumeSpecName: "kube-api-access-r84tf") pod "b523b3d5-ba31-4620-8287-055d6bc931cc" (UID: "b523b3d5-ba31-4620-8287-055d6bc931cc"). InnerVolumeSpecName "kube-api-access-r84tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.269529 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfdtb\" (UniqueName: \"kubernetes.io/projected/0fe08e69-e0ea-451b-a61f-7b62fdb90245-kube-api-access-rfdtb\") pod \"redhat-operators-t886p\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.284523 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-inventory" (OuterVolumeSpecName: "inventory") pod "b523b3d5-ba31-4620-8287-055d6bc931cc" (UID: "b523b3d5-ba31-4620-8287-055d6bc931cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.296990 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b523b3d5-ba31-4620-8287-055d6bc931cc" (UID: "b523b3d5-ba31-4620-8287-055d6bc931cc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.355226 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.355262 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b523b3d5-ba31-4620-8287-055d6bc931cc-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.355272 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r84tf\" (UniqueName: \"kubernetes.io/projected/b523b3d5-ba31-4620-8287-055d6bc931cc-kube-api-access-r84tf\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.456173 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.759821 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" event={"ID":"b523b3d5-ba31-4620-8287-055d6bc931cc","Type":"ContainerDied","Data":"6dd943472e25c15570069cbfced1e6b2e36fa670a3b317e3b36f37da18dc4118"} Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.760177 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd943472e25c15570069cbfced1e6b2e36fa670a3b317e3b36f37da18dc4118" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.760085 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.842249 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82"] Oct 14 13:29:22 crc kubenswrapper[4837]: E1014 13:29:22.842693 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b523b3d5-ba31-4620-8287-055d6bc931cc" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.842710 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b523b3d5-ba31-4620-8287-055d6bc931cc" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.842899 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b523b3d5-ba31-4620-8287-055d6bc931cc" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.843508 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.845924 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.846114 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.846261 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.847355 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.855643 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82"] Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.937509 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t886p"] Oct 14 13:29:22 crc kubenswrapper[4837]: W1014 13:29:22.941964 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe08e69_e0ea_451b_a61f_7b62fdb90245.slice/crio-7087fe9a9ad967f4a0b6b2a717189c53e4d21e68c1af6730c4dd3ec9822b7460 WatchSource:0}: Error finding container 7087fe9a9ad967f4a0b6b2a717189c53e4d21e68c1af6730c4dd3ec9822b7460: Status 404 returned error can't find the container with id 7087fe9a9ad967f4a0b6b2a717189c53e4d21e68c1af6730c4dd3ec9822b7460 Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.965810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.966063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzjv\" (UniqueName: \"kubernetes.io/projected/6fa36834-4501-43b2-8084-2c79052f5185-kube-api-access-6fzjv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:22 crc kubenswrapper[4837]: I1014 13:29:22.966149 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.067928 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.068383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzjv\" (UniqueName: \"kubernetes.io/projected/6fa36834-4501-43b2-8084-2c79052f5185-kube-api-access-6fzjv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.068414 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.074799 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.075028 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.091781 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzjv\" (UniqueName: \"kubernetes.io/projected/6fa36834-4501-43b2-8084-2c79052f5185-kube-api-access-6fzjv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qxb82\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.170167 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.672291 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82"] Oct 14 13:29:23 crc kubenswrapper[4837]: W1014 13:29:23.673314 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fa36834_4501_43b2_8084_2c79052f5185.slice/crio-caa3a9a0d77d412d073a4fcc9bbd1cbab410f465efd92791193804c6926e1be9 WatchSource:0}: Error finding container caa3a9a0d77d412d073a4fcc9bbd1cbab410f465efd92791193804c6926e1be9: Status 404 returned error can't find the container with id caa3a9a0d77d412d073a4fcc9bbd1cbab410f465efd92791193804c6926e1be9 Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.771332 4837 generic.go:334] "Generic (PLEG): container finished" podID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerID="ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3" exitCode=0 Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.771399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t886p" event={"ID":"0fe08e69-e0ea-451b-a61f-7b62fdb90245","Type":"ContainerDied","Data":"ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3"} Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.771498 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t886p" event={"ID":"0fe08e69-e0ea-451b-a61f-7b62fdb90245","Type":"ContainerStarted","Data":"7087fe9a9ad967f4a0b6b2a717189c53e4d21e68c1af6730c4dd3ec9822b7460"} Oct 14 13:29:23 crc kubenswrapper[4837]: I1014 13:29:23.773032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" 
event={"ID":"6fa36834-4501-43b2-8084-2c79052f5185","Type":"ContainerStarted","Data":"caa3a9a0d77d412d073a4fcc9bbd1cbab410f465efd92791193804c6926e1be9"} Oct 14 13:29:24 crc kubenswrapper[4837]: I1014 13:29:24.800579 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" event={"ID":"6fa36834-4501-43b2-8084-2c79052f5185","Type":"ContainerStarted","Data":"9a1c2a816d606442642b182a20e0aa00c542bec70c690727dcf389f5d991d821"} Oct 14 13:29:24 crc kubenswrapper[4837]: I1014 13:29:24.812833 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" podStartSLOduration=2.303950468 podStartE2EDuration="2.812810618s" podCreationTimestamp="2025-10-14 13:29:22 +0000 UTC" firstStartedPulling="2025-10-14 13:29:23.677895258 +0000 UTC m=+1701.594895071" lastFinishedPulling="2025-10-14 13:29:24.186755368 +0000 UTC m=+1702.103755221" observedRunningTime="2025-10-14 13:29:24.809052227 +0000 UTC m=+1702.726052040" watchObservedRunningTime="2025-10-14 13:29:24.812810618 +0000 UTC m=+1702.729810441" Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.043536 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jpbct"] Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.060269 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mmjlq"] Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.068695 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-trsrp"] Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.075189 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mmjlq"] Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.081262 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-trsrp"] Oct 14 13:29:25 crc kubenswrapper[4837]: 
I1014 13:29:25.087744 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jpbct"] Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.805431 4837 generic.go:334] "Generic (PLEG): container finished" podID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerID="03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84" exitCode=0 Oct 14 13:29:25 crc kubenswrapper[4837]: I1014 13:29:25.805575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t886p" event={"ID":"0fe08e69-e0ea-451b-a61f-7b62fdb90245","Type":"ContainerDied","Data":"03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84"} Oct 14 13:29:26 crc kubenswrapper[4837]: I1014 13:29:26.803802 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3ab4e0-d484-4d59-a19d-c6c3c197542f" path="/var/lib/kubelet/pods/7f3ab4e0-d484-4d59-a19d-c6c3c197542f/volumes" Oct 14 13:29:26 crc kubenswrapper[4837]: I1014 13:29:26.804984 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b359e514-29e9-456b-814d-7e86c9f18e4c" path="/var/lib/kubelet/pods/b359e514-29e9-456b-814d-7e86c9f18e4c/volumes" Oct 14 13:29:26 crc kubenswrapper[4837]: I1014 13:29:26.805536 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb637136-1a3b-4d9c-a991-619e80c8cf31" path="/var/lib/kubelet/pods/cb637136-1a3b-4d9c-a991-619e80c8cf31/volumes" Oct 14 13:29:26 crc kubenswrapper[4837]: I1014 13:29:26.816921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t886p" event={"ID":"0fe08e69-e0ea-451b-a61f-7b62fdb90245","Type":"ContainerStarted","Data":"b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c"} Oct 14 13:29:26 crc kubenswrapper[4837]: I1014 13:29:26.847624 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t886p" podStartSLOduration=2.232919034 
podStartE2EDuration="4.847599652s" podCreationTimestamp="2025-10-14 13:29:22 +0000 UTC" firstStartedPulling="2025-10-14 13:29:23.774206745 +0000 UTC m=+1701.691206558" lastFinishedPulling="2025-10-14 13:29:26.388887363 +0000 UTC m=+1704.305887176" observedRunningTime="2025-10-14 13:29:26.841800765 +0000 UTC m=+1704.758800618" watchObservedRunningTime="2025-10-14 13:29:26.847599652 +0000 UTC m=+1704.764599465" Oct 14 13:29:29 crc kubenswrapper[4837]: I1014 13:29:29.854592 4837 generic.go:334] "Generic (PLEG): container finished" podID="6fa36834-4501-43b2-8084-2c79052f5185" containerID="9a1c2a816d606442642b182a20e0aa00c542bec70c690727dcf389f5d991d821" exitCode=0 Oct 14 13:29:29 crc kubenswrapper[4837]: I1014 13:29:29.854678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" event={"ID":"6fa36834-4501-43b2-8084-2c79052f5185","Type":"ContainerDied","Data":"9a1c2a816d606442642b182a20e0aa00c542bec70c690727dcf389f5d991d821"} Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.292411 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.460235 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-ssh-key\") pod \"6fa36834-4501-43b2-8084-2c79052f5185\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.460308 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fzjv\" (UniqueName: \"kubernetes.io/projected/6fa36834-4501-43b2-8084-2c79052f5185-kube-api-access-6fzjv\") pod \"6fa36834-4501-43b2-8084-2c79052f5185\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.460335 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-inventory\") pod \"6fa36834-4501-43b2-8084-2c79052f5185\" (UID: \"6fa36834-4501-43b2-8084-2c79052f5185\") " Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.477148 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa36834-4501-43b2-8084-2c79052f5185-kube-api-access-6fzjv" (OuterVolumeSpecName: "kube-api-access-6fzjv") pod "6fa36834-4501-43b2-8084-2c79052f5185" (UID: "6fa36834-4501-43b2-8084-2c79052f5185"). InnerVolumeSpecName "kube-api-access-6fzjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.509413 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6fa36834-4501-43b2-8084-2c79052f5185" (UID: "6fa36834-4501-43b2-8084-2c79052f5185"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.509507 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-inventory" (OuterVolumeSpecName: "inventory") pod "6fa36834-4501-43b2-8084-2c79052f5185" (UID: "6fa36834-4501-43b2-8084-2c79052f5185"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.562549 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.562588 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fzjv\" (UniqueName: \"kubernetes.io/projected/6fa36834-4501-43b2-8084-2c79052f5185-kube-api-access-6fzjv\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.562603 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa36834-4501-43b2-8084-2c79052f5185-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.878315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" event={"ID":"6fa36834-4501-43b2-8084-2c79052f5185","Type":"ContainerDied","Data":"caa3a9a0d77d412d073a4fcc9bbd1cbab410f465efd92791193804c6926e1be9"} Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.878359 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caa3a9a0d77d412d073a4fcc9bbd1cbab410f465efd92791193804c6926e1be9" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.878375 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qxb82" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.953441 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc"] Oct 14 13:29:31 crc kubenswrapper[4837]: E1014 13:29:31.953836 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa36834-4501-43b2-8084-2c79052f5185" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.953852 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa36834-4501-43b2-8084-2c79052f5185" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.954030 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa36834-4501-43b2-8084-2c79052f5185" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.954639 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.957914 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.957963 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.958720 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.964341 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:29:31 crc kubenswrapper[4837]: I1014 13:29:31.972909 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc"] Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.071240 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.071307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ndr\" (UniqueName: \"kubernetes.io/projected/12398715-a536-446f-81aa-00aa7b0546ed-kube-api-access-57ndr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.071461 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.173866 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.173929 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ndr\" (UniqueName: \"kubernetes.io/projected/12398715-a536-446f-81aa-00aa7b0546ed-kube-api-access-57ndr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.174006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.177204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: 
\"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.185671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.202886 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57ndr\" (UniqueName: \"kubernetes.io/projected/12398715-a536-446f-81aa-00aa7b0546ed-kube-api-access-57ndr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hr9tc\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.271495 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.456613 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.456941 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.505438 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.798388 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:29:32 crc kubenswrapper[4837]: E1014 13:29:32.798977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.835403 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc"] Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.886921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" event={"ID":"12398715-a536-446f-81aa-00aa7b0546ed","Type":"ContainerStarted","Data":"85f27bb362e28ccf0ecb8e5e110f1736fcc2a7a0acd4cf7eab1a2501c175f946"} Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.934875 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:32 crc kubenswrapper[4837]: I1014 13:29:32.982284 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t886p"] Oct 14 13:29:33 crc kubenswrapper[4837]: I1014 13:29:33.897266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" event={"ID":"12398715-a536-446f-81aa-00aa7b0546ed","Type":"ContainerStarted","Data":"90de3d1862ca1be15ea943ba98a3474339ab2a1d483c40dd1dfad26eb3335cb5"} Oct 14 13:29:33 crc kubenswrapper[4837]: I1014 13:29:33.926031 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" podStartSLOduration=2.44969097 podStartE2EDuration="2.926009683s" podCreationTimestamp="2025-10-14 13:29:31 +0000 UTC" firstStartedPulling="2025-10-14 13:29:32.847858604 +0000 UTC m=+1710.764858437" lastFinishedPulling="2025-10-14 13:29:33.324177337 +0000 UTC m=+1711.241177150" observedRunningTime="2025-10-14 13:29:33.908983214 +0000 UTC m=+1711.825983067" watchObservedRunningTime="2025-10-14 13:29:33.926009683 +0000 UTC m=+1711.843009506" Oct 14 13:29:34 crc kubenswrapper[4837]: I1014 13:29:34.904807 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t886p" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="registry-server" containerID="cri-o://b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c" gracePeriod=2 Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.061458 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8b25-account-create-x7kg4"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.071380 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-87c3-account-create-dc67f"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.078928 4837 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8b25-account-create-x7kg4"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.085432 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-23f1-account-create-2lvq2"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.092142 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-87c3-account-create-dc67f"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.098708 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-23f1-account-create-2lvq2"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.377868 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.578304 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-utilities\") pod \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.578464 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-catalog-content\") pod \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.578487 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfdtb\" (UniqueName: \"kubernetes.io/projected/0fe08e69-e0ea-451b-a61f-7b62fdb90245-kube-api-access-rfdtb\") pod \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\" (UID: \"0fe08e69-e0ea-451b-a61f-7b62fdb90245\") " Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.579516 4837 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-utilities" (OuterVolumeSpecName: "utilities") pod "0fe08e69-e0ea-451b-a61f-7b62fdb90245" (UID: "0fe08e69-e0ea-451b-a61f-7b62fdb90245"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.580266 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.584974 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe08e69-e0ea-451b-a61f-7b62fdb90245-kube-api-access-rfdtb" (OuterVolumeSpecName: "kube-api-access-rfdtb") pod "0fe08e69-e0ea-451b-a61f-7b62fdb90245" (UID: "0fe08e69-e0ea-451b-a61f-7b62fdb90245"). InnerVolumeSpecName "kube-api-access-rfdtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.668625 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fe08e69-e0ea-451b-a61f-7b62fdb90245" (UID: "0fe08e69-e0ea-451b-a61f-7b62fdb90245"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.681528 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fe08e69-e0ea-451b-a61f-7b62fdb90245-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.681557 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfdtb\" (UniqueName: \"kubernetes.io/projected/0fe08e69-e0ea-451b-a61f-7b62fdb90245-kube-api-access-rfdtb\") on node \"crc\" DevicePath \"\"" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.918082 4837 generic.go:334] "Generic (PLEG): container finished" podID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerID="b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c" exitCode=0 Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.918227 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t886p" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.918251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t886p" event={"ID":"0fe08e69-e0ea-451b-a61f-7b62fdb90245","Type":"ContainerDied","Data":"b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c"} Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.919455 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t886p" event={"ID":"0fe08e69-e0ea-451b-a61f-7b62fdb90245","Type":"ContainerDied","Data":"7087fe9a9ad967f4a0b6b2a717189c53e4d21e68c1af6730c4dd3ec9822b7460"} Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.919535 4837 scope.go:117] "RemoveContainer" containerID="b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.958239 4837 scope.go:117] "RemoveContainer" 
containerID="03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84" Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.961108 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t886p"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.970657 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t886p"] Oct 14 13:29:35 crc kubenswrapper[4837]: I1014 13:29:35.986450 4837 scope.go:117] "RemoveContainer" containerID="ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.037971 4837 scope.go:117] "RemoveContainer" containerID="b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c" Oct 14 13:29:36 crc kubenswrapper[4837]: E1014 13:29:36.038555 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c\": container with ID starting with b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c not found: ID does not exist" containerID="b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.038613 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c"} err="failed to get container status \"b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c\": rpc error: code = NotFound desc = could not find container \"b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c\": container with ID starting with b9cd8c05114319a1edc1e32ae922c504350b58968166518f70e999a8a68c3c6c not found: ID does not exist" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.038640 4837 scope.go:117] "RemoveContainer" 
containerID="03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84" Oct 14 13:29:36 crc kubenswrapper[4837]: E1014 13:29:36.039082 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84\": container with ID starting with 03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84 not found: ID does not exist" containerID="03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.039106 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84"} err="failed to get container status \"03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84\": rpc error: code = NotFound desc = could not find container \"03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84\": container with ID starting with 03ac23cc815b5828f584d638aaad803ca97ef6c77c937adfd06cf2824984ab84 not found: ID does not exist" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.039120 4837 scope.go:117] "RemoveContainer" containerID="ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3" Oct 14 13:29:36 crc kubenswrapper[4837]: E1014 13:29:36.039357 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3\": container with ID starting with ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3 not found: ID does not exist" containerID="ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.039377 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3"} err="failed to get container status \"ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3\": rpc error: code = NotFound desc = could not find container \"ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3\": container with ID starting with ea3b91a37778cbdf3cfc5b7a5ac7845d9c337f808d4394477e91dcaf1680cdc3 not found: ID does not exist" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.808393 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" path="/var/lib/kubelet/pods/0fe08e69-e0ea-451b-a61f-7b62fdb90245/volumes" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.809715 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151a9907-f0f3-424b-ab8d-59072c705b8b" path="/var/lib/kubelet/pods/151a9907-f0f3-424b-ab8d-59072c705b8b/volumes" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.810845 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58df43b-87d6-4bf3-ae5f-1eba933a068e" path="/var/lib/kubelet/pods/d58df43b-87d6-4bf3-ae5f-1eba933a068e/volumes" Oct 14 13:29:36 crc kubenswrapper[4837]: I1014 13:29:36.813129 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c11a38-420a-42a5-b9f7-08785e1c1342" path="/var/lib/kubelet/pods/e5c11a38-420a-42a5-b9f7-08785e1c1342/volumes" Oct 14 13:29:44 crc kubenswrapper[4837]: I1014 13:29:44.785608 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:29:44 crc kubenswrapper[4837]: E1014 13:29:44.786726 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:29:58 crc kubenswrapper[4837]: I1014 13:29:58.785218 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:29:58 crc kubenswrapper[4837]: E1014 13:29:58.786079 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:29:59 crc kubenswrapper[4837]: I1014 13:29:59.043538 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zmvhn"] Oct 14 13:29:59 crc kubenswrapper[4837]: I1014 13:29:59.054399 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zmvhn"] Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.157889 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq"] Oct 14 13:30:00 crc kubenswrapper[4837]: E1014 13:30:00.158281 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="extract-utilities" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.158296 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="extract-utilities" Oct 14 13:30:00 crc kubenswrapper[4837]: E1014 13:30:00.158329 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" 
containerName="extract-content" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.158336 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="extract-content" Oct 14 13:30:00 crc kubenswrapper[4837]: E1014 13:30:00.158359 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="registry-server" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.158367 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="registry-server" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.158571 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe08e69-e0ea-451b-a61f-7b62fdb90245" containerName="registry-server" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.159203 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.162039 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.166502 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.176480 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq"] Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.323149 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4f5226d-88de-4248-83a6-d9e660c136a8-secret-volume\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.323565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4f5226d-88de-4248-83a6-d9e660c136a8-config-volume\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.323708 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlt2n\" (UniqueName: \"kubernetes.io/projected/b4f5226d-88de-4248-83a6-d9e660c136a8-kube-api-access-xlt2n\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.424569 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4f5226d-88de-4248-83a6-d9e660c136a8-secret-volume\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.424645 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4f5226d-88de-4248-83a6-d9e660c136a8-config-volume\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.424682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlt2n\" (UniqueName: 
\"kubernetes.io/projected/b4f5226d-88de-4248-83a6-d9e660c136a8-kube-api-access-xlt2n\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.425696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4f5226d-88de-4248-83a6-d9e660c136a8-config-volume\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.436948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4f5226d-88de-4248-83a6-d9e660c136a8-secret-volume\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.453829 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlt2n\" (UniqueName: \"kubernetes.io/projected/b4f5226d-88de-4248-83a6-d9e660c136a8-kube-api-access-xlt2n\") pod \"collect-profiles-29340810-b8fsq\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.479180 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.796681 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a699c63-862d-41a9-9dbd-5d81978c7985" path="/var/lib/kubelet/pods/0a699c63-862d-41a9-9dbd-5d81978c7985/volumes" Oct 14 13:30:00 crc kubenswrapper[4837]: I1014 13:30:00.927851 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq"] Oct 14 13:30:00 crc kubenswrapper[4837]: W1014 13:30:00.938879 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f5226d_88de_4248_83a6_d9e660c136a8.slice/crio-6ebead7c046d6bfb4b3e20b6fff52f11de52dd562f134fa1c55d54107c53ea56 WatchSource:0}: Error finding container 6ebead7c046d6bfb4b3e20b6fff52f11de52dd562f134fa1c55d54107c53ea56: Status 404 returned error can't find the container with id 6ebead7c046d6bfb4b3e20b6fff52f11de52dd562f134fa1c55d54107c53ea56 Oct 14 13:30:01 crc kubenswrapper[4837]: I1014 13:30:01.180467 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" event={"ID":"b4f5226d-88de-4248-83a6-d9e660c136a8","Type":"ContainerStarted","Data":"7892515923c5eb0003e0e09e85230f08c29dc43700a1586e1fedfb7c9f1e7b04"} Oct 14 13:30:01 crc kubenswrapper[4837]: I1014 13:30:01.180883 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" event={"ID":"b4f5226d-88de-4248-83a6-d9e660c136a8","Type":"ContainerStarted","Data":"6ebead7c046d6bfb4b3e20b6fff52f11de52dd562f134fa1c55d54107c53ea56"} Oct 14 13:30:01 crc kubenswrapper[4837]: I1014 13:30:01.204141 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" 
podStartSLOduration=1.204119262 podStartE2EDuration="1.204119262s" podCreationTimestamp="2025-10-14 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:30:01.195629583 +0000 UTC m=+1739.112629396" watchObservedRunningTime="2025-10-14 13:30:01.204119262 +0000 UTC m=+1739.121119085" Oct 14 13:30:02 crc kubenswrapper[4837]: I1014 13:30:02.192251 4837 generic.go:334] "Generic (PLEG): container finished" podID="b4f5226d-88de-4248-83a6-d9e660c136a8" containerID="7892515923c5eb0003e0e09e85230f08c29dc43700a1586e1fedfb7c9f1e7b04" exitCode=0 Oct 14 13:30:02 crc kubenswrapper[4837]: I1014 13:30:02.192301 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" event={"ID":"b4f5226d-88de-4248-83a6-d9e660c136a8","Type":"ContainerDied","Data":"7892515923c5eb0003e0e09e85230f08c29dc43700a1586e1fedfb7c9f1e7b04"} Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.566618 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.693233 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4f5226d-88de-4248-83a6-d9e660c136a8-secret-volume\") pod \"b4f5226d-88de-4248-83a6-d9e660c136a8\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.693424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4f5226d-88de-4248-83a6-d9e660c136a8-config-volume\") pod \"b4f5226d-88de-4248-83a6-d9e660c136a8\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.693494 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlt2n\" (UniqueName: \"kubernetes.io/projected/b4f5226d-88de-4248-83a6-d9e660c136a8-kube-api-access-xlt2n\") pod \"b4f5226d-88de-4248-83a6-d9e660c136a8\" (UID: \"b4f5226d-88de-4248-83a6-d9e660c136a8\") " Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.693877 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4f5226d-88de-4248-83a6-d9e660c136a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "b4f5226d-88de-4248-83a6-d9e660c136a8" (UID: "b4f5226d-88de-4248-83a6-d9e660c136a8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.694332 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4f5226d-88de-4248-83a6-d9e660c136a8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.698505 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f5226d-88de-4248-83a6-d9e660c136a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b4f5226d-88de-4248-83a6-d9e660c136a8" (UID: "b4f5226d-88de-4248-83a6-d9e660c136a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.701468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f5226d-88de-4248-83a6-d9e660c136a8-kube-api-access-xlt2n" (OuterVolumeSpecName: "kube-api-access-xlt2n") pod "b4f5226d-88de-4248-83a6-d9e660c136a8" (UID: "b4f5226d-88de-4248-83a6-d9e660c136a8"). InnerVolumeSpecName "kube-api-access-xlt2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.795545 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4f5226d-88de-4248-83a6-d9e660c136a8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:30:03 crc kubenswrapper[4837]: I1014 13:30:03.795579 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlt2n\" (UniqueName: \"kubernetes.io/projected/b4f5226d-88de-4248-83a6-d9e660c136a8-kube-api-access-xlt2n\") on node \"crc\" DevicePath \"\"" Oct 14 13:30:04 crc kubenswrapper[4837]: I1014 13:30:04.215094 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" event={"ID":"b4f5226d-88de-4248-83a6-d9e660c136a8","Type":"ContainerDied","Data":"6ebead7c046d6bfb4b3e20b6fff52f11de52dd562f134fa1c55d54107c53ea56"} Oct 14 13:30:04 crc kubenswrapper[4837]: I1014 13:30:04.215602 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ebead7c046d6bfb4b3e20b6fff52f11de52dd562f134fa1c55d54107c53ea56" Oct 14 13:30:04 crc kubenswrapper[4837]: I1014 13:30:04.215383 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340810-b8fsq" Oct 14 13:30:06 crc kubenswrapper[4837]: I1014 13:30:06.722813 4837 scope.go:117] "RemoveContainer" containerID="3a4a1c4646135f719cc83377c2dd7ecb91cce3e4dd405c8c0e7399f6a8e1c0f0" Oct 14 13:30:06 crc kubenswrapper[4837]: I1014 13:30:06.766369 4837 scope.go:117] "RemoveContainer" containerID="452e3e4995effccd4e1e14fb9ef73c5dfe426ecdbb529c37281ebb0b866b2f21" Oct 14 13:30:06 crc kubenswrapper[4837]: I1014 13:30:06.841741 4837 scope.go:117] "RemoveContainer" containerID="424ae7873ca431a6b35de5724c85298f676e413955818457d6e8ce3dfdde5dec" Oct 14 13:30:06 crc kubenswrapper[4837]: I1014 13:30:06.885632 4837 scope.go:117] "RemoveContainer" containerID="8d8fb652e7baf9b186ba9013bb794f0a2da8988ba35ab26ebb63b8dce919f5d3" Oct 14 13:30:06 crc kubenswrapper[4837]: I1014 13:30:06.915540 4837 scope.go:117] "RemoveContainer" containerID="f2d7760bc7810e242f6f5676cb97d1341e34ac15e9d05d294bc7811f77a52ad2" Oct 14 13:30:06 crc kubenswrapper[4837]: I1014 13:30:06.956270 4837 scope.go:117] "RemoveContainer" containerID="cd138eec01a6fc44ee03eecdf3f58ae0230093cfa3a792114672df5342df5a10" Oct 14 13:30:07 crc kubenswrapper[4837]: I1014 13:30:07.019297 4837 scope.go:117] "RemoveContainer" containerID="4fe29d7164af95d4644a047c4ca450031224e03d56c6336aac62af6caba68c80" Oct 14 13:30:12 crc kubenswrapper[4837]: I1014 13:30:12.299723 4837 generic.go:334] "Generic (PLEG): container finished" podID="12398715-a536-446f-81aa-00aa7b0546ed" containerID="90de3d1862ca1be15ea943ba98a3474339ab2a1d483c40dd1dfad26eb3335cb5" exitCode=0 Oct 14 13:30:12 crc kubenswrapper[4837]: I1014 13:30:12.299859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" event={"ID":"12398715-a536-446f-81aa-00aa7b0546ed","Type":"ContainerDied","Data":"90de3d1862ca1be15ea943ba98a3474339ab2a1d483c40dd1dfad26eb3335cb5"} Oct 14 13:30:13 crc kubenswrapper[4837]: I1014 
13:30:13.784596 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:30:13 crc kubenswrapper[4837]: E1014 13:30:13.785176 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:30:13 crc kubenswrapper[4837]: I1014 13:30:13.814043 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.002354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-inventory\") pod \"12398715-a536-446f-81aa-00aa7b0546ed\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.003194 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57ndr\" (UniqueName: \"kubernetes.io/projected/12398715-a536-446f-81aa-00aa7b0546ed-kube-api-access-57ndr\") pod \"12398715-a536-446f-81aa-00aa7b0546ed\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.003236 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-ssh-key\") pod \"12398715-a536-446f-81aa-00aa7b0546ed\" (UID: \"12398715-a536-446f-81aa-00aa7b0546ed\") " Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.007469 4837 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12398715-a536-446f-81aa-00aa7b0546ed-kube-api-access-57ndr" (OuterVolumeSpecName: "kube-api-access-57ndr") pod "12398715-a536-446f-81aa-00aa7b0546ed" (UID: "12398715-a536-446f-81aa-00aa7b0546ed"). InnerVolumeSpecName "kube-api-access-57ndr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.027751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "12398715-a536-446f-81aa-00aa7b0546ed" (UID: "12398715-a536-446f-81aa-00aa7b0546ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.040325 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-inventory" (OuterVolumeSpecName: "inventory") pod "12398715-a536-446f-81aa-00aa7b0546ed" (UID: "12398715-a536-446f-81aa-00aa7b0546ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.105773 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57ndr\" (UniqueName: \"kubernetes.io/projected/12398715-a536-446f-81aa-00aa7b0546ed-kube-api-access-57ndr\") on node \"crc\" DevicePath \"\"" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.105818 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.105828 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12398715-a536-446f-81aa-00aa7b0546ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.322039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" event={"ID":"12398715-a536-446f-81aa-00aa7b0546ed","Type":"ContainerDied","Data":"85f27bb362e28ccf0ecb8e5e110f1736fcc2a7a0acd4cf7eab1a2501c175f946"} Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.322099 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f27bb362e28ccf0ecb8e5e110f1736fcc2a7a0acd4cf7eab1a2501c175f946" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.322206 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hr9tc" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.392312 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl"] Oct 14 13:30:14 crc kubenswrapper[4837]: E1014 13:30:14.393028 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f5226d-88de-4248-83a6-d9e660c136a8" containerName="collect-profiles" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.393046 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f5226d-88de-4248-83a6-d9e660c136a8" containerName="collect-profiles" Oct 14 13:30:14 crc kubenswrapper[4837]: E1014 13:30:14.393060 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12398715-a536-446f-81aa-00aa7b0546ed" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.393070 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="12398715-a536-446f-81aa-00aa7b0546ed" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.393319 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f5226d-88de-4248-83a6-d9e660c136a8" containerName="collect-profiles" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.393365 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="12398715-a536-446f-81aa-00aa7b0546ed" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.394109 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.395729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.396270 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.396301 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.400342 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.403483 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl"] Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.513490 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zrwb\" (UniqueName: \"kubernetes.io/projected/1febddb2-b222-433e-b8bc-47a3956bc38d-kube-api-access-5zrwb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.513607 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.513676 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.615097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.615252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zrwb\" (UniqueName: \"kubernetes.io/projected/1febddb2-b222-433e-b8bc-47a3956bc38d-kube-api-access-5zrwb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.615311 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.619567 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: 
\"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.619704 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.634024 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zrwb\" (UniqueName: \"kubernetes.io/projected/1febddb2-b222-433e-b8bc-47a3956bc38d-kube-api-access-5zrwb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9npl\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:14 crc kubenswrapper[4837]: I1014 13:30:14.715959 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:30:15 crc kubenswrapper[4837]: I1014 13:30:15.298364 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl"] Oct 14 13:30:15 crc kubenswrapper[4837]: I1014 13:30:15.332389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" event={"ID":"1febddb2-b222-433e-b8bc-47a3956bc38d","Type":"ContainerStarted","Data":"4d295c5cdcdf587540ab1528584472ff84326cbdd987dd5bfe87977d995e505d"} Oct 14 13:30:16 crc kubenswrapper[4837]: I1014 13:30:16.341812 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" event={"ID":"1febddb2-b222-433e-b8bc-47a3956bc38d","Type":"ContainerStarted","Data":"dc4c64b365d4a7b3383392fa8cecea514b488ca12081b1eeca728f7405c1575e"} Oct 14 13:30:16 crc kubenswrapper[4837]: I1014 13:30:16.369468 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" podStartSLOduration=1.945920887 podStartE2EDuration="2.36945112s" podCreationTimestamp="2025-10-14 13:30:14 +0000 UTC" firstStartedPulling="2025-10-14 13:30:15.30332568 +0000 UTC m=+1753.220325493" lastFinishedPulling="2025-10-14 13:30:15.726855863 +0000 UTC m=+1753.643855726" observedRunningTime="2025-10-14 13:30:16.363273444 +0000 UTC m=+1754.280273257" watchObservedRunningTime="2025-10-14 13:30:16.36945112 +0000 UTC m=+1754.286450933" Oct 14 13:30:22 crc kubenswrapper[4837]: I1014 13:30:22.052805 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-k4v9z"] Oct 14 13:30:22 crc kubenswrapper[4837]: I1014 13:30:22.060695 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-k4v9z"] Oct 14 13:30:22 crc kubenswrapper[4837]: I1014 
13:30:22.807071 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6738cf5-e029-4393-9dcb-818a2e5ed0b3" path="/var/lib/kubelet/pods/a6738cf5-e029-4393-9dcb-818a2e5ed0b3/volumes" Oct 14 13:30:27 crc kubenswrapper[4837]: I1014 13:30:27.785295 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:30:27 crc kubenswrapper[4837]: E1014 13:30:27.786138 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:30:30 crc kubenswrapper[4837]: I1014 13:30:30.063128 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r9shz"] Oct 14 13:30:30 crc kubenswrapper[4837]: I1014 13:30:30.073700 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r9shz"] Oct 14 13:30:30 crc kubenswrapper[4837]: I1014 13:30:30.804859 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc381d25-0a5e-438b-b2c8-45edc80148f4" path="/var/lib/kubelet/pods/bc381d25-0a5e-438b-b2c8-45edc80148f4/volumes" Oct 14 13:30:38 crc kubenswrapper[4837]: I1014 13:30:38.784715 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:30:38 crc kubenswrapper[4837]: E1014 13:30:38.785647 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:30:50 crc kubenswrapper[4837]: I1014 13:30:50.785616 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:30:50 crc kubenswrapper[4837]: E1014 13:30:50.786917 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:31:05 crc kubenswrapper[4837]: I1014 13:31:05.785484 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:31:05 crc kubenswrapper[4837]: E1014 13:31:05.786342 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:31:07 crc kubenswrapper[4837]: I1014 13:31:07.045783 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5hkhc"] Oct 14 13:31:07 crc kubenswrapper[4837]: I1014 13:31:07.054004 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5hkhc"] Oct 14 13:31:07 crc kubenswrapper[4837]: I1014 13:31:07.188548 4837 scope.go:117] "RemoveContainer" 
containerID="443ff350c29b3303e59363224be1b5cd139d697736cb9756bc30bff5a1b8f62c" Oct 14 13:31:07 crc kubenswrapper[4837]: I1014 13:31:07.231298 4837 scope.go:117] "RemoveContainer" containerID="7ad0b1260b4f58255dd5eb02e380782519fc7ecb1c45ac008e0413662ceb9cf7" Oct 14 13:31:08 crc kubenswrapper[4837]: I1014 13:31:08.793616 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400438cc-a9d6-4683-8eeb-df32774fc5a5" path="/var/lib/kubelet/pods/400438cc-a9d6-4683-8eeb-df32774fc5a5/volumes" Oct 14 13:31:13 crc kubenswrapper[4837]: I1014 13:31:13.943696 4837 generic.go:334] "Generic (PLEG): container finished" podID="1febddb2-b222-433e-b8bc-47a3956bc38d" containerID="dc4c64b365d4a7b3383392fa8cecea514b488ca12081b1eeca728f7405c1575e" exitCode=2 Oct 14 13:31:13 crc kubenswrapper[4837]: I1014 13:31:13.943870 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" event={"ID":"1febddb2-b222-433e-b8bc-47a3956bc38d","Type":"ContainerDied","Data":"dc4c64b365d4a7b3383392fa8cecea514b488ca12081b1eeca728f7405c1575e"} Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.370531 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.475210 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-inventory\") pod \"1febddb2-b222-433e-b8bc-47a3956bc38d\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.475267 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zrwb\" (UniqueName: \"kubernetes.io/projected/1febddb2-b222-433e-b8bc-47a3956bc38d-kube-api-access-5zrwb\") pod \"1febddb2-b222-433e-b8bc-47a3956bc38d\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.475296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-ssh-key\") pod \"1febddb2-b222-433e-b8bc-47a3956bc38d\" (UID: \"1febddb2-b222-433e-b8bc-47a3956bc38d\") " Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.480630 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1febddb2-b222-433e-b8bc-47a3956bc38d-kube-api-access-5zrwb" (OuterVolumeSpecName: "kube-api-access-5zrwb") pod "1febddb2-b222-433e-b8bc-47a3956bc38d" (UID: "1febddb2-b222-433e-b8bc-47a3956bc38d"). InnerVolumeSpecName "kube-api-access-5zrwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.500668 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-inventory" (OuterVolumeSpecName: "inventory") pod "1febddb2-b222-433e-b8bc-47a3956bc38d" (UID: "1febddb2-b222-433e-b8bc-47a3956bc38d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.505667 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1febddb2-b222-433e-b8bc-47a3956bc38d" (UID: "1febddb2-b222-433e-b8bc-47a3956bc38d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.577856 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zrwb\" (UniqueName: \"kubernetes.io/projected/1febddb2-b222-433e-b8bc-47a3956bc38d-kube-api-access-5zrwb\") on node \"crc\" DevicePath \"\"" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.577898 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.577912 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1febddb2-b222-433e-b8bc-47a3956bc38d-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.962868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" event={"ID":"1febddb2-b222-433e-b8bc-47a3956bc38d","Type":"ContainerDied","Data":"4d295c5cdcdf587540ab1528584472ff84326cbdd987dd5bfe87977d995e505d"} Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.963228 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d295c5cdcdf587540ab1528584472ff84326cbdd987dd5bfe87977d995e505d" Oct 14 13:31:15 crc kubenswrapper[4837]: I1014 13:31:15.962925 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9npl" Oct 14 13:31:18 crc kubenswrapper[4837]: I1014 13:31:18.784676 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:31:20 crc kubenswrapper[4837]: I1014 13:31:20.008788 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"b628675282f277238c95d5d491e72ef951faf707ba5860d9bb8d286e48707d31"} Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.032956 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd"] Oct 14 13:31:23 crc kubenswrapper[4837]: E1014 13:31:23.033986 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1febddb2-b222-433e-b8bc-47a3956bc38d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.034002 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1febddb2-b222-433e-b8bc-47a3956bc38d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.038490 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1febddb2-b222-433e-b8bc-47a3956bc38d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.039452 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.044873 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.045236 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.045354 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.045380 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.057866 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd"] Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.128935 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7pnv\" (UniqueName: \"kubernetes.io/projected/b030d75a-71e0-41af-9ab0-298924d1a955-kube-api-access-k7pnv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.128988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.129101 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.231108 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7pnv\" (UniqueName: \"kubernetes.io/projected/b030d75a-71e0-41af-9ab0-298924d1a955-kube-api-access-k7pnv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.231222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.231260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.240057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: 
\"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.240152 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.271329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7pnv\" (UniqueName: \"kubernetes.io/projected/b030d75a-71e0-41af-9ab0-298924d1a955-kube-api-access-k7pnv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.379875 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.975722 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd"] Oct 14 13:31:23 crc kubenswrapper[4837]: I1014 13:31:23.984689 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:31:24 crc kubenswrapper[4837]: I1014 13:31:24.060515 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" event={"ID":"b030d75a-71e0-41af-9ab0-298924d1a955","Type":"ContainerStarted","Data":"947d63270d3eac4e73053ad9bedcb075763745832fa1d22f7934a44a86a4fa76"} Oct 14 13:31:25 crc kubenswrapper[4837]: I1014 13:31:25.072456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" event={"ID":"b030d75a-71e0-41af-9ab0-298924d1a955","Type":"ContainerStarted","Data":"68beb28bdb4b4f37ce0a79b58b83c1ce6297f0b58454820dec47e0bc1ec169d2"} Oct 14 13:31:25 crc kubenswrapper[4837]: I1014 13:31:25.096730 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" podStartSLOduration=1.607506689 podStartE2EDuration="2.096713014s" podCreationTimestamp="2025-10-14 13:31:23 +0000 UTC" firstStartedPulling="2025-10-14 13:31:23.984483639 +0000 UTC m=+1821.901483452" lastFinishedPulling="2025-10-14 13:31:24.473689914 +0000 UTC m=+1822.390689777" observedRunningTime="2025-10-14 13:31:25.092624853 +0000 UTC m=+1823.009624686" watchObservedRunningTime="2025-10-14 13:31:25.096713014 +0000 UTC m=+1823.013712817" Oct 14 13:32:07 crc kubenswrapper[4837]: I1014 13:32:07.330799 4837 scope.go:117] "RemoveContainer" containerID="6ca09ff19002cf33eee6fe24b5ee12596b16e0baba3200cea57a856f2e0becc1" Oct 14 13:32:16 crc 
kubenswrapper[4837]: I1014 13:32:16.624708 4837 generic.go:334] "Generic (PLEG): container finished" podID="b030d75a-71e0-41af-9ab0-298924d1a955" containerID="68beb28bdb4b4f37ce0a79b58b83c1ce6297f0b58454820dec47e0bc1ec169d2" exitCode=0 Oct 14 13:32:16 crc kubenswrapper[4837]: I1014 13:32:16.624963 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" event={"ID":"b030d75a-71e0-41af-9ab0-298924d1a955","Type":"ContainerDied","Data":"68beb28bdb4b4f37ce0a79b58b83c1ce6297f0b58454820dec47e0bc1ec169d2"} Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.043310 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.192121 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-inventory\") pod \"b030d75a-71e0-41af-9ab0-298924d1a955\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.192312 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7pnv\" (UniqueName: \"kubernetes.io/projected/b030d75a-71e0-41af-9ab0-298924d1a955-kube-api-access-k7pnv\") pod \"b030d75a-71e0-41af-9ab0-298924d1a955\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.192390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-ssh-key\") pod \"b030d75a-71e0-41af-9ab0-298924d1a955\" (UID: \"b030d75a-71e0-41af-9ab0-298924d1a955\") " Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.197845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b030d75a-71e0-41af-9ab0-298924d1a955-kube-api-access-k7pnv" (OuterVolumeSpecName: "kube-api-access-k7pnv") pod "b030d75a-71e0-41af-9ab0-298924d1a955" (UID: "b030d75a-71e0-41af-9ab0-298924d1a955"). InnerVolumeSpecName "kube-api-access-k7pnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.218733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b030d75a-71e0-41af-9ab0-298924d1a955" (UID: "b030d75a-71e0-41af-9ab0-298924d1a955"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.219881 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-inventory" (OuterVolumeSpecName: "inventory") pod "b030d75a-71e0-41af-9ab0-298924d1a955" (UID: "b030d75a-71e0-41af-9ab0-298924d1a955"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.294780 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7pnv\" (UniqueName: \"kubernetes.io/projected/b030d75a-71e0-41af-9ab0-298924d1a955-kube-api-access-k7pnv\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.294823 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.294838 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b030d75a-71e0-41af-9ab0-298924d1a955-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.646148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" event={"ID":"b030d75a-71e0-41af-9ab0-298924d1a955","Type":"ContainerDied","Data":"947d63270d3eac4e73053ad9bedcb075763745832fa1d22f7934a44a86a4fa76"} Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.646214 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="947d63270d3eac4e73053ad9bedcb075763745832fa1d22f7934a44a86a4fa76" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.646275 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.745315 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m8c7v"] Oct 14 13:32:18 crc kubenswrapper[4837]: E1014 13:32:18.745691 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b030d75a-71e0-41af-9ab0-298924d1a955" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.745707 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b030d75a-71e0-41af-9ab0-298924d1a955" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.745909 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b030d75a-71e0-41af-9ab0-298924d1a955" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.746740 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.751552 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.756601 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.756854 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m8c7v"] Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.760507 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.761043 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.905492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.905551 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:18 crc kubenswrapper[4837]: I1014 13:32:18.905713 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7lfjv\" (UniqueName: \"kubernetes.io/projected/805056c6-9ce3-4dcf-852d-2a71b8627f80-kube-api-access-7lfjv\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.007415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lfjv\" (UniqueName: \"kubernetes.io/projected/805056c6-9ce3-4dcf-852d-2a71b8627f80-kube-api-access-7lfjv\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.007518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.007549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.016251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 
13:32:19.020471 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.025140 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lfjv\" (UniqueName: \"kubernetes.io/projected/805056c6-9ce3-4dcf-852d-2a71b8627f80-kube-api-access-7lfjv\") pod \"ssh-known-hosts-edpm-deployment-m8c7v\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.065254 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.594770 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m8c7v"] Oct 14 13:32:19 crc kubenswrapper[4837]: I1014 13:32:19.655697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" event={"ID":"805056c6-9ce3-4dcf-852d-2a71b8627f80","Type":"ContainerStarted","Data":"aacaf11d5315a80e87d5a20a4afaa4eb3c07dee567fb429a6f8604dc88c49467"} Oct 14 13:32:21 crc kubenswrapper[4837]: I1014 13:32:21.681516 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" event={"ID":"805056c6-9ce3-4dcf-852d-2a71b8627f80","Type":"ContainerStarted","Data":"19587880da38ab37c9aaa46a0afd0477dabd66f8ed6de9788abe5ed3d8e3e0e0"} Oct 14 13:32:21 crc kubenswrapper[4837]: I1014 13:32:21.716255 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" podStartSLOduration=2.656496685 
podStartE2EDuration="3.716238453s" podCreationTimestamp="2025-10-14 13:32:18 +0000 UTC" firstStartedPulling="2025-10-14 13:32:19.602120982 +0000 UTC m=+1877.519120795" lastFinishedPulling="2025-10-14 13:32:20.66186275 +0000 UTC m=+1878.578862563" observedRunningTime="2025-10-14 13:32:21.702867842 +0000 UTC m=+1879.619867695" watchObservedRunningTime="2025-10-14 13:32:21.716238453 +0000 UTC m=+1879.633238266" Oct 14 13:32:28 crc kubenswrapper[4837]: I1014 13:32:28.767331 4837 generic.go:334] "Generic (PLEG): container finished" podID="805056c6-9ce3-4dcf-852d-2a71b8627f80" containerID="19587880da38ab37c9aaa46a0afd0477dabd66f8ed6de9788abe5ed3d8e3e0e0" exitCode=0 Oct 14 13:32:28 crc kubenswrapper[4837]: I1014 13:32:28.767365 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" event={"ID":"805056c6-9ce3-4dcf-852d-2a71b8627f80","Type":"ContainerDied","Data":"19587880da38ab37c9aaa46a0afd0477dabd66f8ed6de9788abe5ed3d8e3e0e0"} Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.193395 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.334296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-inventory-0\") pod \"805056c6-9ce3-4dcf-852d-2a71b8627f80\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.334352 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-ssh-key-openstack-edpm-ipam\") pod \"805056c6-9ce3-4dcf-852d-2a71b8627f80\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.334412 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lfjv\" (UniqueName: \"kubernetes.io/projected/805056c6-9ce3-4dcf-852d-2a71b8627f80-kube-api-access-7lfjv\") pod \"805056c6-9ce3-4dcf-852d-2a71b8627f80\" (UID: \"805056c6-9ce3-4dcf-852d-2a71b8627f80\") " Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.341009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805056c6-9ce3-4dcf-852d-2a71b8627f80-kube-api-access-7lfjv" (OuterVolumeSpecName: "kube-api-access-7lfjv") pod "805056c6-9ce3-4dcf-852d-2a71b8627f80" (UID: "805056c6-9ce3-4dcf-852d-2a71b8627f80"). InnerVolumeSpecName "kube-api-access-7lfjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.383781 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "805056c6-9ce3-4dcf-852d-2a71b8627f80" (UID: "805056c6-9ce3-4dcf-852d-2a71b8627f80"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.383797 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "805056c6-9ce3-4dcf-852d-2a71b8627f80" (UID: "805056c6-9ce3-4dcf-852d-2a71b8627f80"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.437403 4837 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.437466 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/805056c6-9ce3-4dcf-852d-2a71b8627f80-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.437490 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lfjv\" (UniqueName: \"kubernetes.io/projected/805056c6-9ce3-4dcf-852d-2a71b8627f80-kube-api-access-7lfjv\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.792304 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.796423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m8c7v" event={"ID":"805056c6-9ce3-4dcf-852d-2a71b8627f80","Type":"ContainerDied","Data":"aacaf11d5315a80e87d5a20a4afaa4eb3c07dee567fb429a6f8604dc88c49467"} Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.796471 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aacaf11d5315a80e87d5a20a4afaa4eb3c07dee567fb429a6f8604dc88c49467" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.875478 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt"] Oct 14 13:32:30 crc kubenswrapper[4837]: E1014 13:32:30.876086 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805056c6-9ce3-4dcf-852d-2a71b8627f80" containerName="ssh-known-hosts-edpm-deployment" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.876121 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="805056c6-9ce3-4dcf-852d-2a71b8627f80" containerName="ssh-known-hosts-edpm-deployment" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.876542 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="805056c6-9ce3-4dcf-852d-2a71b8627f80" containerName="ssh-known-hosts-edpm-deployment" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.877619 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.882413 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.882485 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.882690 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.882878 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.902084 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt"] Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.953546 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.953919 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2s9\" (UniqueName: \"kubernetes.io/projected/ddd3587a-7e10-4ad8-90bf-c172acc6e635-kube-api-access-4t2s9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:30 crc kubenswrapper[4837]: I1014 13:32:30.953972 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.055422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.055511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2s9\" (UniqueName: \"kubernetes.io/projected/ddd3587a-7e10-4ad8-90bf-c172acc6e635-kube-api-access-4t2s9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.055547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.059538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.059622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.079440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2s9\" (UniqueName: \"kubernetes.io/projected/ddd3587a-7e10-4ad8-90bf-c172acc6e635-kube-api-access-4t2s9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2nnnt\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.198946 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.746984 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt"] Oct 14 13:32:31 crc kubenswrapper[4837]: I1014 13:32:31.807405 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" event={"ID":"ddd3587a-7e10-4ad8-90bf-c172acc6e635","Type":"ContainerStarted","Data":"74aa6c9d986ab82d5c986e4d1ecd402a1cfceb38054406033d5a0f2d0d3ab6fa"} Oct 14 13:32:32 crc kubenswrapper[4837]: I1014 13:32:32.824143 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" event={"ID":"ddd3587a-7e10-4ad8-90bf-c172acc6e635","Type":"ContainerStarted","Data":"5887599a46fa004a9b4977d66fe409181a11da776e0b8239e075bf784ca5dbdd"} Oct 14 13:32:32 crc kubenswrapper[4837]: I1014 13:32:32.854616 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" podStartSLOduration=2.379444279 podStartE2EDuration="2.854590376s" podCreationTimestamp="2025-10-14 13:32:30 +0000 UTC" firstStartedPulling="2025-10-14 13:32:31.765079945 +0000 UTC m=+1889.682079758" lastFinishedPulling="2025-10-14 13:32:32.240226042 +0000 UTC m=+1890.157225855" observedRunningTime="2025-10-14 13:32:32.845271025 +0000 UTC m=+1890.762270878" watchObservedRunningTime="2025-10-14 13:32:32.854590376 +0000 UTC m=+1890.771590199" Oct 14 13:32:41 crc kubenswrapper[4837]: I1014 13:32:41.917176 4837 generic.go:334] "Generic (PLEG): container finished" podID="ddd3587a-7e10-4ad8-90bf-c172acc6e635" containerID="5887599a46fa004a9b4977d66fe409181a11da776e0b8239e075bf784ca5dbdd" exitCode=0 Oct 14 13:32:41 crc kubenswrapper[4837]: I1014 13:32:41.917273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" event={"ID":"ddd3587a-7e10-4ad8-90bf-c172acc6e635","Type":"ContainerDied","Data":"5887599a46fa004a9b4977d66fe409181a11da776e0b8239e075bf784ca5dbdd"} Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.418222 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.504623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-ssh-key\") pod \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.504824 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2s9\" (UniqueName: \"kubernetes.io/projected/ddd3587a-7e10-4ad8-90bf-c172acc6e635-kube-api-access-4t2s9\") pod \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.504966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-inventory\") pod \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\" (UID: \"ddd3587a-7e10-4ad8-90bf-c172acc6e635\") " Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.510389 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd3587a-7e10-4ad8-90bf-c172acc6e635-kube-api-access-4t2s9" (OuterVolumeSpecName: "kube-api-access-4t2s9") pod "ddd3587a-7e10-4ad8-90bf-c172acc6e635" (UID: "ddd3587a-7e10-4ad8-90bf-c172acc6e635"). InnerVolumeSpecName "kube-api-access-4t2s9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.537656 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-inventory" (OuterVolumeSpecName: "inventory") pod "ddd3587a-7e10-4ad8-90bf-c172acc6e635" (UID: "ddd3587a-7e10-4ad8-90bf-c172acc6e635"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.539556 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ddd3587a-7e10-4ad8-90bf-c172acc6e635" (UID: "ddd3587a-7e10-4ad8-90bf-c172acc6e635"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.607514 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.607584 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd3587a-7e10-4ad8-90bf-c172acc6e635-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.607601 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2s9\" (UniqueName: \"kubernetes.io/projected/ddd3587a-7e10-4ad8-90bf-c172acc6e635-kube-api-access-4t2s9\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.939330 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" 
event={"ID":"ddd3587a-7e10-4ad8-90bf-c172acc6e635","Type":"ContainerDied","Data":"74aa6c9d986ab82d5c986e4d1ecd402a1cfceb38054406033d5a0f2d0d3ab6fa"} Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.939373 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74aa6c9d986ab82d5c986e4d1ecd402a1cfceb38054406033d5a0f2d0d3ab6fa" Oct 14 13:32:43 crc kubenswrapper[4837]: I1014 13:32:43.939395 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2nnnt" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.008050 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj"] Oct 14 13:32:44 crc kubenswrapper[4837]: E1014 13:32:44.008441 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd3587a-7e10-4ad8-90bf-c172acc6e635" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.008463 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd3587a-7e10-4ad8-90bf-c172acc6e635" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.008642 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd3587a-7e10-4ad8-90bf-c172acc6e635" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.009230 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.014817 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.016445 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.016514 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.017285 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.035678 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj"] Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.116682 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/611b04f3-d9fa-4841-8cd5-608c99279890-kube-api-access-zlvzx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.117117 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.117350 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.219117 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.219276 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.219349 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/611b04f3-d9fa-4841-8cd5-608c99279890-kube-api-access-zlvzx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.223831 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.225399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.240254 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/611b04f3-d9fa-4841-8cd5-608c99279890-kube-api-access-zlvzx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.330971 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:44 crc kubenswrapper[4837]: W1014 13:32:44.876509 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod611b04f3_d9fa_4841_8cd5_608c99279890.slice/crio-ca1413e1cca8648612e3b296a84bc6681eea0fb5d563e43e405d381aaef3e455 WatchSource:0}: Error finding container ca1413e1cca8648612e3b296a84bc6681eea0fb5d563e43e405d381aaef3e455: Status 404 returned error can't find the container with id ca1413e1cca8648612e3b296a84bc6681eea0fb5d563e43e405d381aaef3e455 Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.878851 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj"] Oct 14 13:32:44 crc kubenswrapper[4837]: I1014 13:32:44.951892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" event={"ID":"611b04f3-d9fa-4841-8cd5-608c99279890","Type":"ContainerStarted","Data":"ca1413e1cca8648612e3b296a84bc6681eea0fb5d563e43e405d381aaef3e455"} Oct 14 13:32:45 crc kubenswrapper[4837]: I1014 13:32:45.961494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" event={"ID":"611b04f3-d9fa-4841-8cd5-608c99279890","Type":"ContainerStarted","Data":"1bd242a828e8566d33f7a84b2ca25e4b86d80a5c9eaa3ebb8c26136c386ca9b6"} Oct 14 13:32:45 crc kubenswrapper[4837]: I1014 13:32:45.984225 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" podStartSLOduration=2.551686069 podStartE2EDuration="2.984202874s" podCreationTimestamp="2025-10-14 13:32:43 +0000 UTC" firstStartedPulling="2025-10-14 13:32:44.880228312 +0000 UTC m=+1902.797228125" lastFinishedPulling="2025-10-14 13:32:45.312745107 +0000 UTC m=+1903.229744930" 
observedRunningTime="2025-10-14 13:32:45.97965029 +0000 UTC m=+1903.896650113" watchObservedRunningTime="2025-10-14 13:32:45.984202874 +0000 UTC m=+1903.901202687" Oct 14 13:32:56 crc kubenswrapper[4837]: I1014 13:32:56.092464 4837 generic.go:334] "Generic (PLEG): container finished" podID="611b04f3-d9fa-4841-8cd5-608c99279890" containerID="1bd242a828e8566d33f7a84b2ca25e4b86d80a5c9eaa3ebb8c26136c386ca9b6" exitCode=0 Oct 14 13:32:56 crc kubenswrapper[4837]: I1014 13:32:56.092557 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" event={"ID":"611b04f3-d9fa-4841-8cd5-608c99279890","Type":"ContainerDied","Data":"1bd242a828e8566d33f7a84b2ca25e4b86d80a5c9eaa3ebb8c26136c386ca9b6"} Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.519469 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.591745 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-inventory\") pod \"611b04f3-d9fa-4841-8cd5-608c99279890\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.591875 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-ssh-key\") pod \"611b04f3-d9fa-4841-8cd5-608c99279890\" (UID: \"611b04f3-d9fa-4841-8cd5-608c99279890\") " Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.591911 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/611b04f3-d9fa-4841-8cd5-608c99279890-kube-api-access-zlvzx\") pod \"611b04f3-d9fa-4841-8cd5-608c99279890\" (UID: 
\"611b04f3-d9fa-4841-8cd5-608c99279890\") " Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.597676 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611b04f3-d9fa-4841-8cd5-608c99279890-kube-api-access-zlvzx" (OuterVolumeSpecName: "kube-api-access-zlvzx") pod "611b04f3-d9fa-4841-8cd5-608c99279890" (UID: "611b04f3-d9fa-4841-8cd5-608c99279890"). InnerVolumeSpecName "kube-api-access-zlvzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.618425 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-inventory" (OuterVolumeSpecName: "inventory") pod "611b04f3-d9fa-4841-8cd5-608c99279890" (UID: "611b04f3-d9fa-4841-8cd5-608c99279890"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.628449 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "611b04f3-d9fa-4841-8cd5-608c99279890" (UID: "611b04f3-d9fa-4841-8cd5-608c99279890"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.694555 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.694602 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611b04f3-d9fa-4841-8cd5-608c99279890-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:57 crc kubenswrapper[4837]: I1014 13:32:57.694613 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/611b04f3-d9fa-4841-8cd5-608c99279890-kube-api-access-zlvzx\") on node \"crc\" DevicePath \"\"" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.108904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" event={"ID":"611b04f3-d9fa-4841-8cd5-608c99279890","Type":"ContainerDied","Data":"ca1413e1cca8648612e3b296a84bc6681eea0fb5d563e43e405d381aaef3e455"} Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.108966 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.108976 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca1413e1cca8648612e3b296a84bc6681eea0fb5d563e43e405d381aaef3e455" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.206019 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd"] Oct 14 13:32:58 crc kubenswrapper[4837]: E1014 13:32:58.207966 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611b04f3-d9fa-4841-8cd5-608c99279890" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.208003 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="611b04f3-d9fa-4841-8cd5-608c99279890" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.208250 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="611b04f3-d9fa-4841-8cd5-608c99279890" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.209037 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.214796 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.215050 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.217133 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.217329 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.217575 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd"] Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.218302 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.218470 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.219228 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.219575 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304745 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304833 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304864 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304884 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304904 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.304931 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305196 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305222 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305246 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305262 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh94g\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-kube-api-access-sh94g\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.305292 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.406847 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.406908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.406941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.406982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407081 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407183 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407214 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407243 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407267 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh94g\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-kube-api-access-sh94g\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407751 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407805 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.407838 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.413643 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.413743 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.414227 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.414516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.414760 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.415478 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.415925 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.416615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.417272 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.417528 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: 
\"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.419138 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.422812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.423120 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.425732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh94g\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-kube-api-access-sh94g\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 
13:32:58 crc kubenswrapper[4837]: I1014 13:32:58.535973 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:32:59 crc kubenswrapper[4837]: I1014 13:32:59.055264 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd"] Oct 14 13:32:59 crc kubenswrapper[4837]: I1014 13:32:59.119301 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" event={"ID":"97b10dd3-253f-47fa-ad50-4765f7139f4f","Type":"ContainerStarted","Data":"34ffc418e96be068e226dc697f338eb823e2d440d2c39b8cdd20f349ad9b0da6"} Oct 14 13:33:00 crc kubenswrapper[4837]: I1014 13:33:00.130909 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" event={"ID":"97b10dd3-253f-47fa-ad50-4765f7139f4f","Type":"ContainerStarted","Data":"ce94f49126ccaf03f7d7af2deee779f889759084ef49b1c344137397e73e1007"} Oct 14 13:33:00 crc kubenswrapper[4837]: I1014 13:33:00.156799 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" podStartSLOduration=1.6754069010000001 podStartE2EDuration="2.156780005s" podCreationTimestamp="2025-10-14 13:32:58 +0000 UTC" firstStartedPulling="2025-10-14 13:32:59.060562363 +0000 UTC m=+1916.977562176" lastFinishedPulling="2025-10-14 13:32:59.541935467 +0000 UTC m=+1917.458935280" observedRunningTime="2025-10-14 13:33:00.153902538 +0000 UTC m=+1918.070902391" watchObservedRunningTime="2025-10-14 13:33:00.156780005 +0000 UTC m=+1918.073779818" Oct 14 13:33:41 crc kubenswrapper[4837]: I1014 13:33:41.140052 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:33:41 crc kubenswrapper[4837]: I1014 13:33:41.140891 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:33:41 crc kubenswrapper[4837]: I1014 13:33:41.540004 4837 generic.go:334] "Generic (PLEG): container finished" podID="97b10dd3-253f-47fa-ad50-4765f7139f4f" containerID="ce94f49126ccaf03f7d7af2deee779f889759084ef49b1c344137397e73e1007" exitCode=0 Oct 14 13:33:41 crc kubenswrapper[4837]: I1014 13:33:41.540101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" event={"ID":"97b10dd3-253f-47fa-ad50-4765f7139f4f","Type":"ContainerDied","Data":"ce94f49126ccaf03f7d7af2deee779f889759084ef49b1c344137397e73e1007"} Oct 14 13:33:42 crc kubenswrapper[4837]: I1014 13:33:42.992196 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092273 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092310 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092336 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-inventory\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092365 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-nova-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-neutron-metadata-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: 
\"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ovn-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092518 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ssh-key\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-bootstrap-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh94g\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-kube-api-access-sh94g\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092598 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092635 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-telemetry-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-repo-setup-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.092755 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-libvirt-combined-ca-bundle\") pod \"97b10dd3-253f-47fa-ad50-4765f7139f4f\" (UID: \"97b10dd3-253f-47fa-ad50-4765f7139f4f\") " Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.100499 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.101216 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.101509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.101654 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.101806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.101845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.102309 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.102369 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-kube-api-access-sh94g" (OuterVolumeSpecName: "kube-api-access-sh94g") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "kube-api-access-sh94g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.104027 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.104110 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.108327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.110105 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.125366 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-inventory" (OuterVolumeSpecName: "inventory") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.127048 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "97b10dd3-253f-47fa-ad50-4765f7139f4f" (UID: "97b10dd3-253f-47fa-ad50-4765f7139f4f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195322 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195387 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195411 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195432 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195450 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195467 4837 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh94g\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-kube-api-access-sh94g\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195488 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195507 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195524 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195541 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195559 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195577 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/97b10dd3-253f-47fa-ad50-4765f7139f4f-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195596 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.195613 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b10dd3-253f-47fa-ad50-4765f7139f4f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.562128 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" event={"ID":"97b10dd3-253f-47fa-ad50-4765f7139f4f","Type":"ContainerDied","Data":"34ffc418e96be068e226dc697f338eb823e2d440d2c39b8cdd20f349ad9b0da6"} Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.562390 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ffc418e96be068e226dc697f338eb823e2d440d2c39b8cdd20f349ad9b0da6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.562816 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.728344 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6"] Oct 14 13:33:43 crc kubenswrapper[4837]: E1014 13:33:43.729078 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b10dd3-253f-47fa-ad50-4765f7139f4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.729234 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b10dd3-253f-47fa-ad50-4765f7139f4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.729602 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b10dd3-253f-47fa-ad50-4765f7139f4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.730464 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.736129 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.736478 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.736515 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.736804 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.736957 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.739851 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6"] Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.805663 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6f5e2181-b922-48d6-909c-ad1f87fee631-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.805723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.805751 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.805777 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2g9\" (UniqueName: \"kubernetes.io/projected/6f5e2181-b922-48d6-909c-ad1f87fee631-kube-api-access-hm2g9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.805809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.907130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6f5e2181-b922-48d6-909c-ad1f87fee631-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.907236 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.907267 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.907297 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2g9\" (UniqueName: \"kubernetes.io/projected/6f5e2181-b922-48d6-909c-ad1f87fee631-kube-api-access-hm2g9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.907322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.909037 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6f5e2181-b922-48d6-909c-ad1f87fee631-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc 
kubenswrapper[4837]: I1014 13:33:43.912945 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.913376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.913551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:43 crc kubenswrapper[4837]: I1014 13:33:43.930069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2g9\" (UniqueName: \"kubernetes.io/projected/6f5e2181-b922-48d6-909c-ad1f87fee631-kube-api-access-hm2g9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hc8z6\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:44 crc kubenswrapper[4837]: I1014 13:33:44.053234 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:33:44 crc kubenswrapper[4837]: I1014 13:33:44.568879 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6"] Oct 14 13:33:45 crc kubenswrapper[4837]: I1014 13:33:45.585346 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" event={"ID":"6f5e2181-b922-48d6-909c-ad1f87fee631","Type":"ContainerStarted","Data":"904937f6d4dc3260b95d488538aab5d25c6fbe1f6d41da6be268a94491cc71ce"} Oct 14 13:33:46 crc kubenswrapper[4837]: I1014 13:33:46.595319 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" event={"ID":"6f5e2181-b922-48d6-909c-ad1f87fee631","Type":"ContainerStarted","Data":"c59983a84de37fcad73f5888420eb732262b1adbf5fd46ecb88ddb34f7808332"} Oct 14 13:34:11 crc kubenswrapper[4837]: I1014 13:34:11.140624 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:34:11 crc kubenswrapper[4837]: I1014 13:34:11.142506 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:34:41 crc kubenswrapper[4837]: I1014 13:34:41.139724 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 14 13:34:41 crc kubenswrapper[4837]: I1014 13:34:41.143509 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:34:41 crc kubenswrapper[4837]: I1014 13:34:41.143575 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:34:41 crc kubenswrapper[4837]: I1014 13:34:41.144326 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b628675282f277238c95d5d491e72ef951faf707ba5860d9bb8d286e48707d31"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:34:41 crc kubenswrapper[4837]: I1014 13:34:41.144395 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://b628675282f277238c95d5d491e72ef951faf707ba5860d9bb8d286e48707d31" gracePeriod=600 Oct 14 13:34:42 crc kubenswrapper[4837]: I1014 13:34:42.165119 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="b628675282f277238c95d5d491e72ef951faf707ba5860d9bb8d286e48707d31" exitCode=0 Oct 14 13:34:42 crc kubenswrapper[4837]: I1014 13:34:42.165214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"b628675282f277238c95d5d491e72ef951faf707ba5860d9bb8d286e48707d31"} Oct 14 13:34:42 crc kubenswrapper[4837]: I1014 13:34:42.166854 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0"} Oct 14 13:34:42 crc kubenswrapper[4837]: I1014 13:34:42.166959 4837 scope.go:117] "RemoveContainer" containerID="2256effc4209a122c8655f36037b402d0edce391a12d8c75e3d5c225a5685b7b" Oct 14 13:34:42 crc kubenswrapper[4837]: I1014 13:34:42.198888 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" podStartSLOduration=58.202014631 podStartE2EDuration="59.198862312s" podCreationTimestamp="2025-10-14 13:33:43 +0000 UTC" firstStartedPulling="2025-10-14 13:33:44.573547528 +0000 UTC m=+1962.490547341" lastFinishedPulling="2025-10-14 13:33:45.570395199 +0000 UTC m=+1963.487395022" observedRunningTime="2025-10-14 13:33:46.61952671 +0000 UTC m=+1964.536526523" watchObservedRunningTime="2025-10-14 13:34:42.198862312 +0000 UTC m=+2020.115862145" Oct 14 13:34:50 crc kubenswrapper[4837]: I1014 13:34:50.264493 4837 generic.go:334] "Generic (PLEG): container finished" podID="6f5e2181-b922-48d6-909c-ad1f87fee631" containerID="c59983a84de37fcad73f5888420eb732262b1adbf5fd46ecb88ddb34f7808332" exitCode=0 Oct 14 13:34:50 crc kubenswrapper[4837]: I1014 13:34:50.264548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" event={"ID":"6f5e2181-b922-48d6-909c-ad1f87fee631","Type":"ContainerDied","Data":"c59983a84de37fcad73f5888420eb732262b1adbf5fd46ecb88ddb34f7808332"} Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.868143 4837 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.977896 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-inventory\") pod \"6f5e2181-b922-48d6-909c-ad1f87fee631\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.977946 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ssh-key\") pod \"6f5e2181-b922-48d6-909c-ad1f87fee631\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.977971 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6f5e2181-b922-48d6-909c-ad1f87fee631-ovncontroller-config-0\") pod \"6f5e2181-b922-48d6-909c-ad1f87fee631\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.978050 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm2g9\" (UniqueName: \"kubernetes.io/projected/6f5e2181-b922-48d6-909c-ad1f87fee631-kube-api-access-hm2g9\") pod \"6f5e2181-b922-48d6-909c-ad1f87fee631\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.978198 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ovn-combined-ca-bundle\") pod \"6f5e2181-b922-48d6-909c-ad1f87fee631\" (UID: \"6f5e2181-b922-48d6-909c-ad1f87fee631\") " Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.983203 4837 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6f5e2181-b922-48d6-909c-ad1f87fee631" (UID: "6f5e2181-b922-48d6-909c-ad1f87fee631"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:34:51 crc kubenswrapper[4837]: I1014 13:34:51.985090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5e2181-b922-48d6-909c-ad1f87fee631-kube-api-access-hm2g9" (OuterVolumeSpecName: "kube-api-access-hm2g9") pod "6f5e2181-b922-48d6-909c-ad1f87fee631" (UID: "6f5e2181-b922-48d6-909c-ad1f87fee631"). InnerVolumeSpecName "kube-api-access-hm2g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.008382 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f5e2181-b922-48d6-909c-ad1f87fee631" (UID: "6f5e2181-b922-48d6-909c-ad1f87fee631"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.009350 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-inventory" (OuterVolumeSpecName: "inventory") pod "6f5e2181-b922-48d6-909c-ad1f87fee631" (UID: "6f5e2181-b922-48d6-909c-ad1f87fee631"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.025553 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5e2181-b922-48d6-909c-ad1f87fee631-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6f5e2181-b922-48d6-909c-ad1f87fee631" (UID: "6f5e2181-b922-48d6-909c-ad1f87fee631"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.080517 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.080568 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.080588 4837 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6f5e2181-b922-48d6-909c-ad1f87fee631-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.080605 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm2g9\" (UniqueName: \"kubernetes.io/projected/6f5e2181-b922-48d6-909c-ad1f87fee631-kube-api-access-hm2g9\") on node \"crc\" DevicePath \"\"" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.080624 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5e2181-b922-48d6-909c-ad1f87fee631-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.284263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" event={"ID":"6f5e2181-b922-48d6-909c-ad1f87fee631","Type":"ContainerDied","Data":"904937f6d4dc3260b95d488538aab5d25c6fbe1f6d41da6be268a94491cc71ce"} Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.284303 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="904937f6d4dc3260b95d488538aab5d25c6fbe1f6d41da6be268a94491cc71ce" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.284359 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hc8z6" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.389190 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h"] Oct 14 13:34:52 crc kubenswrapper[4837]: E1014 13:34:52.389538 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5e2181-b922-48d6-909c-ad1f87fee631" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.389555 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5e2181-b922-48d6-909c-ad1f87fee631" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.389726 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5e2181-b922-48d6-909c-ad1f87fee631" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.390335 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.393837 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.393872 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.394126 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.394673 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.395411 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.396354 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.412785 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h"] Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.487297 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.487352 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w2dpn\" (UniqueName: \"kubernetes.io/projected/e0186e2a-7938-4646-ba9c-768d75c09605-kube-api-access-w2dpn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.487921 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.488082 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.488248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.488316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.591386 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.591683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.592436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.592496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.592546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dpn\" (UniqueName: \"kubernetes.io/projected/e0186e2a-7938-4646-ba9c-768d75c09605-kube-api-access-w2dpn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.592648 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.595578 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.595947 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: 
\"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.597135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.598816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.599873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.616976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dpn\" (UniqueName: \"kubernetes.io/projected/e0186e2a-7938-4646-ba9c-768d75c09605-kube-api-access-w2dpn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:52 crc kubenswrapper[4837]: I1014 13:34:52.712891 4837 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:34:53 crc kubenswrapper[4837]: I1014 13:34:53.278964 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h"] Oct 14 13:34:53 crc kubenswrapper[4837]: W1014 13:34:53.293882 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0186e2a_7938_4646_ba9c_768d75c09605.slice/crio-f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8 WatchSource:0}: Error finding container f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8: Status 404 returned error can't find the container with id f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8 Oct 14 13:34:54 crc kubenswrapper[4837]: I1014 13:34:54.305131 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" event={"ID":"e0186e2a-7938-4646-ba9c-768d75c09605","Type":"ContainerStarted","Data":"4ea76d962804613fb6d8a8f26045db877b51b7d311223145a3cf906c7ad94115"} Oct 14 13:34:54 crc kubenswrapper[4837]: I1014 13:34:54.307492 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" event={"ID":"e0186e2a-7938-4646-ba9c-768d75c09605","Type":"ContainerStarted","Data":"f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8"} Oct 14 13:34:54 crc kubenswrapper[4837]: I1014 13:34:54.324481 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" podStartSLOduration=1.772245163 podStartE2EDuration="2.324462613s" podCreationTimestamp="2025-10-14 13:34:52 +0000 UTC" firstStartedPulling="2025-10-14 13:34:53.297702976 +0000 UTC m=+2031.214702799" lastFinishedPulling="2025-10-14 
13:34:53.849920426 +0000 UTC m=+2031.766920249" observedRunningTime="2025-10-14 13:34:54.319597981 +0000 UTC m=+2032.236597804" watchObservedRunningTime="2025-10-14 13:34:54.324462613 +0000 UTC m=+2032.241462426" Oct 14 13:35:33 crc kubenswrapper[4837]: I1014 13:35:33.727113 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kx8q8"] Oct 14 13:35:33 crc kubenswrapper[4837]: I1014 13:35:33.729563 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:33 crc kubenswrapper[4837]: I1014 13:35:33.748081 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kx8q8"] Oct 14 13:35:33 crc kubenswrapper[4837]: I1014 13:35:33.908101 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghnf\" (UniqueName: \"kubernetes.io/projected/551c0816-f611-47c1-8e0b-b63c15e7d5af-kube-api-access-5ghnf\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:33 crc kubenswrapper[4837]: I1014 13:35:33.908494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-catalog-content\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:33 crc kubenswrapper[4837]: I1014 13:35:33.908714 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-utilities\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 
13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.010043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghnf\" (UniqueName: \"kubernetes.io/projected/551c0816-f611-47c1-8e0b-b63c15e7d5af-kube-api-access-5ghnf\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.010101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-catalog-content\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.010178 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-utilities\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.010693 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-utilities\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.010942 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-catalog-content\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 
13:35:34.032849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghnf\" (UniqueName: \"kubernetes.io/projected/551c0816-f611-47c1-8e0b-b63c15e7d5af-kube-api-access-5ghnf\") pod \"redhat-marketplace-kx8q8\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.054606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.507712 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kx8q8"] Oct 14 13:35:34 crc kubenswrapper[4837]: I1014 13:35:34.718589 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kx8q8" event={"ID":"551c0816-f611-47c1-8e0b-b63c15e7d5af","Type":"ContainerStarted","Data":"37299c229d51d8247993bb07acaf662595a1d1d0817192707c98a6f3ee1c184c"} Oct 14 13:35:35 crc kubenswrapper[4837]: I1014 13:35:35.732015 4837 generic.go:334] "Generic (PLEG): container finished" podID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerID="11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58" exitCode=0 Oct 14 13:35:35 crc kubenswrapper[4837]: I1014 13:35:35.732149 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kx8q8" event={"ID":"551c0816-f611-47c1-8e0b-b63c15e7d5af","Type":"ContainerDied","Data":"11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58"} Oct 14 13:35:36 crc kubenswrapper[4837]: I1014 13:35:36.740932 4837 generic.go:334] "Generic (PLEG): container finished" podID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerID="b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1" exitCode=0 Oct 14 13:35:36 crc kubenswrapper[4837]: I1014 13:35:36.741007 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kx8q8" event={"ID":"551c0816-f611-47c1-8e0b-b63c15e7d5af","Type":"ContainerDied","Data":"b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1"} Oct 14 13:35:37 crc kubenswrapper[4837]: I1014 13:35:37.752577 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kx8q8" event={"ID":"551c0816-f611-47c1-8e0b-b63c15e7d5af","Type":"ContainerStarted","Data":"1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d"} Oct 14 13:35:37 crc kubenswrapper[4837]: I1014 13:35:37.780705 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kx8q8" podStartSLOduration=3.320159738 podStartE2EDuration="4.780685787s" podCreationTimestamp="2025-10-14 13:35:33 +0000 UTC" firstStartedPulling="2025-10-14 13:35:35.734284318 +0000 UTC m=+2073.651284131" lastFinishedPulling="2025-10-14 13:35:37.194810377 +0000 UTC m=+2075.111810180" observedRunningTime="2025-10-14 13:35:37.772220598 +0000 UTC m=+2075.689220411" watchObservedRunningTime="2025-10-14 13:35:37.780685787 +0000 UTC m=+2075.697685600" Oct 14 13:35:44 crc kubenswrapper[4837]: I1014 13:35:44.055643 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:44 crc kubenswrapper[4837]: I1014 13:35:44.056657 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:44 crc kubenswrapper[4837]: I1014 13:35:44.108552 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:44 crc kubenswrapper[4837]: I1014 13:35:44.854938 4837 generic.go:334] "Generic (PLEG): container finished" podID="e0186e2a-7938-4646-ba9c-768d75c09605" containerID="4ea76d962804613fb6d8a8f26045db877b51b7d311223145a3cf906c7ad94115" exitCode=0 Oct 14 13:35:44 
crc kubenswrapper[4837]: I1014 13:35:44.855039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" event={"ID":"e0186e2a-7938-4646-ba9c-768d75c09605","Type":"ContainerDied","Data":"4ea76d962804613fb6d8a8f26045db877b51b7d311223145a3cf906c7ad94115"} Oct 14 13:35:44 crc kubenswrapper[4837]: I1014 13:35:44.911418 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:44 crc kubenswrapper[4837]: I1014 13:35:44.971633 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kx8q8"] Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.267788 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.465230 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2dpn\" (UniqueName: \"kubernetes.io/projected/e0186e2a-7938-4646-ba9c-768d75c09605-kube-api-access-w2dpn\") pod \"e0186e2a-7938-4646-ba9c-768d75c09605\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.465986 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-metadata-combined-ca-bundle\") pod \"e0186e2a-7938-4646-ba9c-768d75c09605\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.466181 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"e0186e2a-7938-4646-ba9c-768d75c09605\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.466346 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-ssh-key\") pod \"e0186e2a-7938-4646-ba9c-768d75c09605\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.466553 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-inventory\") pod \"e0186e2a-7938-4646-ba9c-768d75c09605\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.466740 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-nova-metadata-neutron-config-0\") pod \"e0186e2a-7938-4646-ba9c-768d75c09605\" (UID: \"e0186e2a-7938-4646-ba9c-768d75c09605\") " Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.475506 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e0186e2a-7938-4646-ba9c-768d75c09605" (UID: "e0186e2a-7938-4646-ba9c-768d75c09605"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.475530 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0186e2a-7938-4646-ba9c-768d75c09605-kube-api-access-w2dpn" (OuterVolumeSpecName: "kube-api-access-w2dpn") pod "e0186e2a-7938-4646-ba9c-768d75c09605" (UID: "e0186e2a-7938-4646-ba9c-768d75c09605"). InnerVolumeSpecName "kube-api-access-w2dpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.495087 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e0186e2a-7938-4646-ba9c-768d75c09605" (UID: "e0186e2a-7938-4646-ba9c-768d75c09605"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.512803 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0186e2a-7938-4646-ba9c-768d75c09605" (UID: "e0186e2a-7938-4646-ba9c-768d75c09605"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.513306 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e0186e2a-7938-4646-ba9c-768d75c09605" (UID: "e0186e2a-7938-4646-ba9c-768d75c09605"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.523553 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-inventory" (OuterVolumeSpecName: "inventory") pod "e0186e2a-7938-4646-ba9c-768d75c09605" (UID: "e0186e2a-7938-4646-ba9c-768d75c09605"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.569649 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.569691 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.569705 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2dpn\" (UniqueName: \"kubernetes.io/projected/e0186e2a-7938-4646-ba9c-768d75c09605-kube-api-access-w2dpn\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.569727 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.569742 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 
13:35:46.569755 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0186e2a-7938-4646-ba9c-768d75c09605-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.774932 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pdpv9"] Oct 14 13:35:46 crc kubenswrapper[4837]: E1014 13:35:46.775416 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0186e2a-7938-4646-ba9c-768d75c09605" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.775440 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0186e2a-7938-4646-ba9c-768d75c09605" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.775678 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0186e2a-7938-4646-ba9c-768d75c09605" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.777706 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.783514 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pdpv9"] Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.878883 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69m8h\" (UniqueName: \"kubernetes.io/projected/2acb87d9-108c-4140-9910-9bf140f27bc2-kube-api-access-69m8h\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.878922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-utilities\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.878941 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-catalog-content\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.879923 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" event={"ID":"e0186e2a-7938-4646-ba9c-768d75c09605","Type":"ContainerDied","Data":"f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8"} Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.879967 4837 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.879939 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h" Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.880265 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kx8q8" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="registry-server" containerID="cri-o://1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d" gracePeriod=2 Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.988357 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf"] Oct 14 13:35:46 crc kubenswrapper[4837]: I1014 13:35:46.996763 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.001787 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.003565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.003793 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.003841 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.004683 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.012837 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69m8h\" (UniqueName: \"kubernetes.io/projected/2acb87d9-108c-4140-9910-9bf140f27bc2-kube-api-access-69m8h\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.012903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-utilities\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.012946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-catalog-content\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.013642 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-catalog-content\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.013877 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-utilities\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.027046 4837 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf"] Oct 14 13:35:47 crc kubenswrapper[4837]: E1014 13:35:47.027194 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod551c0816_f611_47c1_8e0b_b63c15e7d5af.slice/crio-conmon-1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0186e2a_7938_4646_ba9c_768d75c09605.slice/crio-f9197b824f5dd18762e217694c74bace45b70e9299dad65c47374c81c35713c8\": RecentStats: unable to find data in memory cache]" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.036531 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69m8h\" (UniqueName: \"kubernetes.io/projected/2acb87d9-108c-4140-9910-9bf140f27bc2-kube-api-access-69m8h\") pod \"community-operators-pdpv9\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.119699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.120292 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.120504 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.120646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72q2g\" (UniqueName: \"kubernetes.io/projected/aacf282f-f2c7-447d-9e73-98a35898f8df-kube-api-access-72q2g\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.120747 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.121147 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.245084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.245143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.245199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.245232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72q2g\" (UniqueName: \"kubernetes.io/projected/aacf282f-f2c7-447d-9e73-98a35898f8df-kube-api-access-72q2g\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.245254 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.255907 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.267106 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.267516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.275017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.291219 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72q2g\" (UniqueName: \"kubernetes.io/projected/aacf282f-f2c7-447d-9e73-98a35898f8df-kube-api-access-72q2g\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-78vdf\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.363510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.419110 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.450061 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ghnf\" (UniqueName: \"kubernetes.io/projected/551c0816-f611-47c1-8e0b-b63c15e7d5af-kube-api-access-5ghnf\") pod \"551c0816-f611-47c1-8e0b-b63c15e7d5af\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.450141 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-catalog-content\") pod \"551c0816-f611-47c1-8e0b-b63c15e7d5af\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.450243 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-utilities\") pod \"551c0816-f611-47c1-8e0b-b63c15e7d5af\" (UID: \"551c0816-f611-47c1-8e0b-b63c15e7d5af\") " Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.454606 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-utilities" (OuterVolumeSpecName: "utilities") pod "551c0816-f611-47c1-8e0b-b63c15e7d5af" (UID: "551c0816-f611-47c1-8e0b-b63c15e7d5af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.461208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551c0816-f611-47c1-8e0b-b63c15e7d5af-kube-api-access-5ghnf" (OuterVolumeSpecName: "kube-api-access-5ghnf") pod "551c0816-f611-47c1-8e0b-b63c15e7d5af" (UID: "551c0816-f611-47c1-8e0b-b63c15e7d5af"). InnerVolumeSpecName "kube-api-access-5ghnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.466461 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "551c0816-f611-47c1-8e0b-b63c15e7d5af" (UID: "551c0816-f611-47c1-8e0b-b63c15e7d5af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.552741 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ghnf\" (UniqueName: \"kubernetes.io/projected/551c0816-f611-47c1-8e0b-b63c15e7d5af-kube-api-access-5ghnf\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.552770 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.552780 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551c0816-f611-47c1-8e0b-b63c15e7d5af-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.765105 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pdpv9"] Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.893028 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerStarted","Data":"2da8abc1f229c54f8ea73486e2a1788c5f21da5a4cbe38783558baf594ef4b7a"} Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.896315 4837 generic.go:334] "Generic (PLEG): container finished" podID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerID="1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d" exitCode=0 Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.896356 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kx8q8" event={"ID":"551c0816-f611-47c1-8e0b-b63c15e7d5af","Type":"ContainerDied","Data":"1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d"} Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 
13:35:47.896382 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kx8q8" event={"ID":"551c0816-f611-47c1-8e0b-b63c15e7d5af","Type":"ContainerDied","Data":"37299c229d51d8247993bb07acaf662595a1d1d0817192707c98a6f3ee1c184c"} Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.896400 4837 scope.go:117] "RemoveContainer" containerID="1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.896539 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kx8q8" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.920501 4837 scope.go:117] "RemoveContainer" containerID="b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.934868 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kx8q8"] Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.944449 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kx8q8"] Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.947439 4837 scope.go:117] "RemoveContainer" containerID="11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.966654 4837 scope.go:117] "RemoveContainer" containerID="1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d" Oct 14 13:35:47 crc kubenswrapper[4837]: E1014 13:35:47.967068 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d\": container with ID starting with 1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d not found: ID does not exist" containerID="1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d" Oct 
14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.967113 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d"} err="failed to get container status \"1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d\": rpc error: code = NotFound desc = could not find container \"1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d\": container with ID starting with 1a4a3bfaa5d87e954d4f507189e148ff1d4bda1e857940bf4e42cd5f5d41694d not found: ID does not exist" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.967142 4837 scope.go:117] "RemoveContainer" containerID="b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1" Oct 14 13:35:47 crc kubenswrapper[4837]: E1014 13:35:47.967678 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1\": container with ID starting with b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1 not found: ID does not exist" containerID="b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.967723 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1"} err="failed to get container status \"b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1\": rpc error: code = NotFound desc = could not find container \"b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1\": container with ID starting with b333a2a2870a9f9d898ce1074ac67393e71b71d8e8ce064b7b1e8d464744abf1 not found: ID does not exist" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.967751 4837 scope.go:117] "RemoveContainer" 
containerID="11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58" Oct 14 13:35:47 crc kubenswrapper[4837]: E1014 13:35:47.969517 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58\": container with ID starting with 11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58 not found: ID does not exist" containerID="11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.969559 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58"} err="failed to get container status \"11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58\": rpc error: code = NotFound desc = could not find container \"11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58\": container with ID starting with 11b63571e1605387b0fce2f1d374646a6b991ee6ee23556ad25f68b4ea272b58 not found: ID does not exist" Oct 14 13:35:47 crc kubenswrapper[4837]: I1014 13:35:47.977782 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf"] Oct 14 13:35:48 crc kubenswrapper[4837]: W1014 13:35:48.039618 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf282f_f2c7_447d_9e73_98a35898f8df.slice/crio-eef45e922a9c90babb38ea58c297fbbd7c82ddde8e279867cf13d08c6726b7d9 WatchSource:0}: Error finding container eef45e922a9c90babb38ea58c297fbbd7c82ddde8e279867cf13d08c6726b7d9: Status 404 returned error can't find the container with id eef45e922a9c90babb38ea58c297fbbd7c82ddde8e279867cf13d08c6726b7d9 Oct 14 13:35:48 crc kubenswrapper[4837]: I1014 13:35:48.806362 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" path="/var/lib/kubelet/pods/551c0816-f611-47c1-8e0b-b63c15e7d5af/volumes" Oct 14 13:35:48 crc kubenswrapper[4837]: I1014 13:35:48.909654 4837 generic.go:334] "Generic (PLEG): container finished" podID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerID="0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce" exitCode=0 Oct 14 13:35:48 crc kubenswrapper[4837]: I1014 13:35:48.909768 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerDied","Data":"0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce"} Oct 14 13:35:48 crc kubenswrapper[4837]: I1014 13:35:48.912629 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" event={"ID":"aacf282f-f2c7-447d-9e73-98a35898f8df","Type":"ContainerStarted","Data":"05b33cd0dad6c0b28a38fe63bbe124ff65f7ef7f7a91fca2c7315e6440f01501"} Oct 14 13:35:48 crc kubenswrapper[4837]: I1014 13:35:48.912656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" event={"ID":"aacf282f-f2c7-447d-9e73-98a35898f8df","Type":"ContainerStarted","Data":"eef45e922a9c90babb38ea58c297fbbd7c82ddde8e279867cf13d08c6726b7d9"} Oct 14 13:35:48 crc kubenswrapper[4837]: I1014 13:35:48.951896 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" podStartSLOduration=2.380671087 podStartE2EDuration="2.951869961s" podCreationTimestamp="2025-10-14 13:35:46 +0000 UTC" firstStartedPulling="2025-10-14 13:35:48.041719461 +0000 UTC m=+2085.958719264" lastFinishedPulling="2025-10-14 13:35:48.612918315 +0000 UTC m=+2086.529918138" observedRunningTime="2025-10-14 13:35:48.950208355 +0000 UTC m=+2086.867208168" watchObservedRunningTime="2025-10-14 13:35:48.951869961 +0000 UTC 
m=+2086.868869814" Oct 14 13:35:49 crc kubenswrapper[4837]: I1014 13:35:49.924277 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerStarted","Data":"93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43"} Oct 14 13:35:50 crc kubenswrapper[4837]: I1014 13:35:50.934942 4837 generic.go:334] "Generic (PLEG): container finished" podID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerID="93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43" exitCode=0 Oct 14 13:35:50 crc kubenswrapper[4837]: I1014 13:35:50.935200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerDied","Data":"93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43"} Oct 14 13:35:51 crc kubenswrapper[4837]: I1014 13:35:51.945690 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerStarted","Data":"5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0"} Oct 14 13:35:51 crc kubenswrapper[4837]: I1014 13:35:51.960750 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pdpv9" podStartSLOduration=3.401405801 podStartE2EDuration="5.960728326s" podCreationTimestamp="2025-10-14 13:35:46 +0000 UTC" firstStartedPulling="2025-10-14 13:35:48.911393215 +0000 UTC m=+2086.828393048" lastFinishedPulling="2025-10-14 13:35:51.47071574 +0000 UTC m=+2089.387715573" observedRunningTime="2025-10-14 13:35:51.958852634 +0000 UTC m=+2089.875852457" watchObservedRunningTime="2025-10-14 13:35:51.960728326 +0000 UTC m=+2089.877728139" Oct 14 13:35:57 crc kubenswrapper[4837]: I1014 13:35:57.121709 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:57 crc kubenswrapper[4837]: I1014 13:35:57.123383 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:57 crc kubenswrapper[4837]: I1014 13:35:57.196221 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:58 crc kubenswrapper[4837]: I1014 13:35:58.074139 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:35:58 crc kubenswrapper[4837]: I1014 13:35:58.138629 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pdpv9"] Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.021476 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pdpv9" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="registry-server" containerID="cri-o://5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0" gracePeriod=2 Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.538121 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.709023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-catalog-content\") pod \"2acb87d9-108c-4140-9910-9bf140f27bc2\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.709201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69m8h\" (UniqueName: \"kubernetes.io/projected/2acb87d9-108c-4140-9910-9bf140f27bc2-kube-api-access-69m8h\") pod \"2acb87d9-108c-4140-9910-9bf140f27bc2\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.709247 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-utilities\") pod \"2acb87d9-108c-4140-9910-9bf140f27bc2\" (UID: \"2acb87d9-108c-4140-9910-9bf140f27bc2\") " Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.710070 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-utilities" (OuterVolumeSpecName: "utilities") pod "2acb87d9-108c-4140-9910-9bf140f27bc2" (UID: "2acb87d9-108c-4140-9910-9bf140f27bc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.720679 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acb87d9-108c-4140-9910-9bf140f27bc2-kube-api-access-69m8h" (OuterVolumeSpecName: "kube-api-access-69m8h") pod "2acb87d9-108c-4140-9910-9bf140f27bc2" (UID: "2acb87d9-108c-4140-9910-9bf140f27bc2"). InnerVolumeSpecName "kube-api-access-69m8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.753452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2acb87d9-108c-4140-9910-9bf140f27bc2" (UID: "2acb87d9-108c-4140-9910-9bf140f27bc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.813694 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.813754 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69m8h\" (UniqueName: \"kubernetes.io/projected/2acb87d9-108c-4140-9910-9bf140f27bc2-kube-api-access-69m8h\") on node \"crc\" DevicePath \"\"" Oct 14 13:36:00 crc kubenswrapper[4837]: I1014 13:36:00.813783 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acb87d9-108c-4140-9910-9bf140f27bc2-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.032137 4837 generic.go:334] "Generic (PLEG): container finished" podID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerID="5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0" exitCode=0 Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.032203 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerDied","Data":"5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0"} Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.032214 4837 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-pdpv9" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.032231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdpv9" event={"ID":"2acb87d9-108c-4140-9910-9bf140f27bc2","Type":"ContainerDied","Data":"2da8abc1f229c54f8ea73486e2a1788c5f21da5a4cbe38783558baf594ef4b7a"} Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.032248 4837 scope.go:117] "RemoveContainer" containerID="5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.068406 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pdpv9"] Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.075148 4837 scope.go:117] "RemoveContainer" containerID="93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.082459 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pdpv9"] Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.108388 4837 scope.go:117] "RemoveContainer" containerID="0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.144342 4837 scope.go:117] "RemoveContainer" containerID="5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0" Oct 14 13:36:01 crc kubenswrapper[4837]: E1014 13:36:01.144756 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0\": container with ID starting with 5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0 not found: ID does not exist" containerID="5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.144793 
4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0"} err="failed to get container status \"5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0\": rpc error: code = NotFound desc = could not find container \"5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0\": container with ID starting with 5d0ad2ec341f63bbafa89138af05da4edd4ac27ae7c0bc842d37c77d2cb67ee0 not found: ID does not exist" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.144818 4837 scope.go:117] "RemoveContainer" containerID="93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43" Oct 14 13:36:01 crc kubenswrapper[4837]: E1014 13:36:01.145281 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43\": container with ID starting with 93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43 not found: ID does not exist" containerID="93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.145307 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43"} err="failed to get container status \"93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43\": rpc error: code = NotFound desc = could not find container \"93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43\": container with ID starting with 93334dfd967cb6d3570dca2193c14b3f498a1eb73a467611d6b4c99f372a7f43 not found: ID does not exist" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.145320 4837 scope.go:117] "RemoveContainer" containerID="0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce" Oct 14 13:36:01 crc kubenswrapper[4837]: E1014 
13:36:01.145601 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce\": container with ID starting with 0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce not found: ID does not exist" containerID="0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce" Oct 14 13:36:01 crc kubenswrapper[4837]: I1014 13:36:01.145621 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce"} err="failed to get container status \"0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce\": rpc error: code = NotFound desc = could not find container \"0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce\": container with ID starting with 0fdaeb20dc33de3bc3d01616558a47d7e1c4309e91b250c9c06d9a0c9eab06ce not found: ID does not exist" Oct 14 13:36:02 crc kubenswrapper[4837]: I1014 13:36:02.810956 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" path="/var/lib/kubelet/pods/2acb87d9-108c-4140-9910-9bf140f27bc2/volumes" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.181959 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nk4kp"] Oct 14 13:36:21 crc kubenswrapper[4837]: E1014 13:36:21.185427 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="extract-utilities" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185450 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="extract-utilities" Oct 14 13:36:21 crc kubenswrapper[4837]: E1014 13:36:21.185470 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="extract-utilities" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185476 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="extract-utilities" Oct 14 13:36:21 crc kubenswrapper[4837]: E1014 13:36:21.185492 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="registry-server" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185498 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="registry-server" Oct 14 13:36:21 crc kubenswrapper[4837]: E1014 13:36:21.185528 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="extract-content" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185534 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="extract-content" Oct 14 13:36:21 crc kubenswrapper[4837]: E1014 13:36:21.185549 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="registry-server" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185555 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="registry-server" Oct 14 13:36:21 crc kubenswrapper[4837]: E1014 13:36:21.185563 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="extract-content" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185568 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="extract-content" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185808 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2acb87d9-108c-4140-9910-9bf140f27bc2" containerName="registry-server" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.185834 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="551c0816-f611-47c1-8e0b-b63c15e7d5af" containerName="registry-server" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.187177 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.196462 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nk4kp"] Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.334973 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-utilities\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.335069 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-catalog-content\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.335145 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgrvl\" (UniqueName: \"kubernetes.io/projected/29f15b18-ec22-451c-a974-5acdf2d683c7-kube-api-access-zgrvl\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.436948 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-catalog-content\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.437534 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgrvl\" (UniqueName: \"kubernetes.io/projected/29f15b18-ec22-451c-a974-5acdf2d683c7-kube-api-access-zgrvl\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.437599 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-utilities\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.437851 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-utilities\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.437431 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-catalog-content\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.460601 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zgrvl\" (UniqueName: \"kubernetes.io/projected/29f15b18-ec22-451c-a974-5acdf2d683c7-kube-api-access-zgrvl\") pod \"certified-operators-nk4kp\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.514697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:21 crc kubenswrapper[4837]: I1014 13:36:21.816130 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nk4kp"] Oct 14 13:36:22 crc kubenswrapper[4837]: I1014 13:36:22.253664 4837 generic.go:334] "Generic (PLEG): container finished" podID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerID="0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302" exitCode=0 Oct 14 13:36:22 crc kubenswrapper[4837]: I1014 13:36:22.253755 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerDied","Data":"0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302"} Oct 14 13:36:22 crc kubenswrapper[4837]: I1014 13:36:22.254014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerStarted","Data":"751c709a3be696a8d35d500131ebda78ef20051c3adf9ff53a72b4ecd22d2c60"} Oct 14 13:36:24 crc kubenswrapper[4837]: I1014 13:36:24.277752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerStarted","Data":"695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a"} Oct 14 13:36:25 crc kubenswrapper[4837]: I1014 13:36:25.294453 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerID="695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a" exitCode=0 Oct 14 13:36:25 crc kubenswrapper[4837]: I1014 13:36:25.294527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerDied","Data":"695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a"} Oct 14 13:36:25 crc kubenswrapper[4837]: I1014 13:36:25.297416 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:36:26 crc kubenswrapper[4837]: I1014 13:36:26.313204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerStarted","Data":"834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f"} Oct 14 13:36:26 crc kubenswrapper[4837]: I1014 13:36:26.342789 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nk4kp" podStartSLOduration=1.6585787490000001 podStartE2EDuration="5.342771276s" podCreationTimestamp="2025-10-14 13:36:21 +0000 UTC" firstStartedPulling="2025-10-14 13:36:22.255669431 +0000 UTC m=+2120.172669244" lastFinishedPulling="2025-10-14 13:36:25.939861918 +0000 UTC m=+2123.856861771" observedRunningTime="2025-10-14 13:36:26.335299754 +0000 UTC m=+2124.252299577" watchObservedRunningTime="2025-10-14 13:36:26.342771276 +0000 UTC m=+2124.259771089" Oct 14 13:36:31 crc kubenswrapper[4837]: I1014 13:36:31.515348 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:31 crc kubenswrapper[4837]: I1014 13:36:31.516460 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:31 crc 
kubenswrapper[4837]: I1014 13:36:31.574670 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:32 crc kubenswrapper[4837]: I1014 13:36:32.419515 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:32 crc kubenswrapper[4837]: I1014 13:36:32.462590 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nk4kp"] Oct 14 13:36:34 crc kubenswrapper[4837]: I1014 13:36:34.404778 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nk4kp" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="registry-server" containerID="cri-o://834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f" gracePeriod=2 Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.369138 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.412005 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-catalog-content\") pod \"29f15b18-ec22-451c-a974-5acdf2d683c7\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.412081 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgrvl\" (UniqueName: \"kubernetes.io/projected/29f15b18-ec22-451c-a974-5acdf2d683c7-kube-api-access-zgrvl\") pod \"29f15b18-ec22-451c-a974-5acdf2d683c7\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.412312 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-utilities\") pod \"29f15b18-ec22-451c-a974-5acdf2d683c7\" (UID: \"29f15b18-ec22-451c-a974-5acdf2d683c7\") " Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.415809 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-utilities" (OuterVolumeSpecName: "utilities") pod "29f15b18-ec22-451c-a974-5acdf2d683c7" (UID: "29f15b18-ec22-451c-a974-5acdf2d683c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.416918 4837 generic.go:334] "Generic (PLEG): container finished" podID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerID="834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f" exitCode=0 Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.416962 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerDied","Data":"834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f"} Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.416998 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nk4kp" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.417017 4837 scope.go:117] "RemoveContainer" containerID="834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.417003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nk4kp" event={"ID":"29f15b18-ec22-451c-a974-5acdf2d683c7","Type":"ContainerDied","Data":"751c709a3be696a8d35d500131ebda78ef20051c3adf9ff53a72b4ecd22d2c60"} Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.439669 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f15b18-ec22-451c-a974-5acdf2d683c7-kube-api-access-zgrvl" (OuterVolumeSpecName: "kube-api-access-zgrvl") pod "29f15b18-ec22-451c-a974-5acdf2d683c7" (UID: "29f15b18-ec22-451c-a974-5acdf2d683c7"). InnerVolumeSpecName "kube-api-access-zgrvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.472235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29f15b18-ec22-451c-a974-5acdf2d683c7" (UID: "29f15b18-ec22-451c-a974-5acdf2d683c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.478855 4837 scope.go:117] "RemoveContainer" containerID="695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.505661 4837 scope.go:117] "RemoveContainer" containerID="0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.514354 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.514385 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgrvl\" (UniqueName: \"kubernetes.io/projected/29f15b18-ec22-451c-a974-5acdf2d683c7-kube-api-access-zgrvl\") on node \"crc\" DevicePath \"\"" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.514394 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29f15b18-ec22-451c-a974-5acdf2d683c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.548076 4837 scope.go:117] "RemoveContainer" containerID="834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f" Oct 14 13:36:35 crc kubenswrapper[4837]: E1014 13:36:35.548656 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f\": container with ID starting with 834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f not found: ID does not exist" containerID="834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.548683 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f"} err="failed to get container status \"834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f\": rpc error: code = NotFound desc = could not find container \"834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f\": container with ID starting with 834718cad356bbe707317c72138b1fc9098d886897999e66b62b7945a048484f not found: ID does not exist" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.548702 4837 scope.go:117] "RemoveContainer" containerID="695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a" Oct 14 13:36:35 crc kubenswrapper[4837]: E1014 13:36:35.549068 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a\": container with ID starting with 695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a not found: ID does not exist" containerID="695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.549125 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a"} err="failed to get container status \"695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a\": rpc error: code = NotFound desc = could not find container 
\"695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a\": container with ID starting with 695659cf6db4aed439b03a9b3b527cede1b924e99fe9a764b62255d80779f24a not found: ID does not exist" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.549173 4837 scope.go:117] "RemoveContainer" containerID="0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302" Oct 14 13:36:35 crc kubenswrapper[4837]: E1014 13:36:35.549575 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302\": container with ID starting with 0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302 not found: ID does not exist" containerID="0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.549697 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302"} err="failed to get container status \"0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302\": rpc error: code = NotFound desc = could not find container \"0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302\": container with ID starting with 0a39b3e1e4aab04c95dc62ff82e69d6449d4e2f317b9c58928a9a6cd71716302 not found: ID does not exist" Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.747793 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nk4kp"] Oct 14 13:36:35 crc kubenswrapper[4837]: I1014 13:36:35.758912 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nk4kp"] Oct 14 13:36:36 crc kubenswrapper[4837]: I1014 13:36:36.801572 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" 
path="/var/lib/kubelet/pods/29f15b18-ec22-451c-a974-5acdf2d683c7/volumes" Oct 14 13:36:41 crc kubenswrapper[4837]: I1014 13:36:41.140462 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:36:41 crc kubenswrapper[4837]: I1014 13:36:41.140897 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:37:11 crc kubenswrapper[4837]: I1014 13:37:11.140636 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:37:11 crc kubenswrapper[4837]: I1014 13:37:11.141465 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:37:41 crc kubenswrapper[4837]: I1014 13:37:41.139785 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:37:41 crc kubenswrapper[4837]: I1014 13:37:41.140524 4837 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:37:41 crc kubenswrapper[4837]: I1014 13:37:41.140597 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:37:41 crc kubenswrapper[4837]: I1014 13:37:41.141626 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:37:41 crc kubenswrapper[4837]: I1014 13:37:41.141720 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" gracePeriod=600 Oct 14 13:37:41 crc kubenswrapper[4837]: E1014 13:37:41.264588 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:37:42 crc kubenswrapper[4837]: I1014 13:37:42.151062 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" exitCode=0 Oct 14 13:37:42 crc kubenswrapper[4837]: I1014 13:37:42.151238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0"} Oct 14 13:37:42 crc kubenswrapper[4837]: I1014 13:37:42.151580 4837 scope.go:117] "RemoveContainer" containerID="b628675282f277238c95d5d491e72ef951faf707ba5860d9bb8d286e48707d31" Oct 14 13:37:42 crc kubenswrapper[4837]: I1014 13:37:42.152483 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:37:42 crc kubenswrapper[4837]: E1014 13:37:42.152971 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:37:56 crc kubenswrapper[4837]: I1014 13:37:56.785690 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:37:56 crc kubenswrapper[4837]: E1014 13:37:56.787064 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 
13:38:07 crc kubenswrapper[4837]: I1014 13:38:07.785080 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:38:07 crc kubenswrapper[4837]: E1014 13:38:07.785906 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:38:20 crc kubenswrapper[4837]: I1014 13:38:20.786040 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:38:20 crc kubenswrapper[4837]: E1014 13:38:20.787148 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:38:33 crc kubenswrapper[4837]: I1014 13:38:33.784287 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:38:33 crc kubenswrapper[4837]: E1014 13:38:33.785395 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:38:47 crc kubenswrapper[4837]: I1014 13:38:47.787228 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:38:47 crc kubenswrapper[4837]: E1014 13:38:47.788412 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:39:00 crc kubenswrapper[4837]: I1014 13:39:00.784285 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:39:00 crc kubenswrapper[4837]: E1014 13:39:00.785248 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:39:11 crc kubenswrapper[4837]: I1014 13:39:11.785141 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:39:11 crc kubenswrapper[4837]: E1014 13:39:11.785987 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:39:23 crc kubenswrapper[4837]: I1014 13:39:23.785739 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:39:23 crc kubenswrapper[4837]: E1014 13:39:23.786807 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:39:34 crc kubenswrapper[4837]: I1014 13:39:34.786205 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:39:34 crc kubenswrapper[4837]: E1014 13:39:34.787502 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:39:46 crc kubenswrapper[4837]: I1014 13:39:46.785607 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:39:46 crc kubenswrapper[4837]: E1014 13:39:46.786618 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:39:58 crc kubenswrapper[4837]: I1014 13:39:58.785252 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:39:58 crc kubenswrapper[4837]: E1014 13:39:58.786095 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:40:09 crc kubenswrapper[4837]: I1014 13:40:09.785609 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:40:09 crc kubenswrapper[4837]: E1014 13:40:09.786914 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:40:13 crc kubenswrapper[4837]: I1014 13:40:13.844588 4837 generic.go:334] "Generic (PLEG): container finished" podID="aacf282f-f2c7-447d-9e73-98a35898f8df" containerID="05b33cd0dad6c0b28a38fe63bbe124ff65f7ef7f7a91fca2c7315e6440f01501" exitCode=0 Oct 14 13:40:13 crc kubenswrapper[4837]: I1014 13:40:13.844685 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" event={"ID":"aacf282f-f2c7-447d-9e73-98a35898f8df","Type":"ContainerDied","Data":"05b33cd0dad6c0b28a38fe63bbe124ff65f7ef7f7a91fca2c7315e6440f01501"} Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.378675 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.538096 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-secret-0\") pod \"aacf282f-f2c7-447d-9e73-98a35898f8df\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.538239 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72q2g\" (UniqueName: \"kubernetes.io/projected/aacf282f-f2c7-447d-9e73-98a35898f8df-kube-api-access-72q2g\") pod \"aacf282f-f2c7-447d-9e73-98a35898f8df\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.538473 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-ssh-key\") pod \"aacf282f-f2c7-447d-9e73-98a35898f8df\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.538503 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-inventory\") pod \"aacf282f-f2c7-447d-9e73-98a35898f8df\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.538588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-combined-ca-bundle\") pod \"aacf282f-f2c7-447d-9e73-98a35898f8df\" (UID: \"aacf282f-f2c7-447d-9e73-98a35898f8df\") " Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.547623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aacf282f-f2c7-447d-9e73-98a35898f8df-kube-api-access-72q2g" (OuterVolumeSpecName: "kube-api-access-72q2g") pod "aacf282f-f2c7-447d-9e73-98a35898f8df" (UID: "aacf282f-f2c7-447d-9e73-98a35898f8df"). InnerVolumeSpecName "kube-api-access-72q2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.561256 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "aacf282f-f2c7-447d-9e73-98a35898f8df" (UID: "aacf282f-f2c7-447d-9e73-98a35898f8df"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.571251 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "aacf282f-f2c7-447d-9e73-98a35898f8df" (UID: "aacf282f-f2c7-447d-9e73-98a35898f8df"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.571525 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-inventory" (OuterVolumeSpecName: "inventory") pod "aacf282f-f2c7-447d-9e73-98a35898f8df" (UID: "aacf282f-f2c7-447d-9e73-98a35898f8df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.579590 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aacf282f-f2c7-447d-9e73-98a35898f8df" (UID: "aacf282f-f2c7-447d-9e73-98a35898f8df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.640721 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72q2g\" (UniqueName: \"kubernetes.io/projected/aacf282f-f2c7-447d-9e73-98a35898f8df-kube-api-access-72q2g\") on node \"crc\" DevicePath \"\"" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.640749 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.640758 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.640767 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.640776 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aacf282f-f2c7-447d-9e73-98a35898f8df-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.872983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" 
event={"ID":"aacf282f-f2c7-447d-9e73-98a35898f8df","Type":"ContainerDied","Data":"eef45e922a9c90babb38ea58c297fbbd7c82ddde8e279867cf13d08c6726b7d9"} Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.873053 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eef45e922a9c90babb38ea58c297fbbd7c82ddde8e279867cf13d08c6726b7d9" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.873146 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-78vdf" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994064 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f"] Oct 14 13:40:15 crc kubenswrapper[4837]: E1014 13:40:15.994627 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="registry-server" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994650 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="registry-server" Oct 14 13:40:15 crc kubenswrapper[4837]: E1014 13:40:15.994675 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aacf282f-f2c7-447d-9e73-98a35898f8df" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994684 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aacf282f-f2c7-447d-9e73-98a35898f8df" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 14 13:40:15 crc kubenswrapper[4837]: E1014 13:40:15.994699 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="extract-utilities" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994707 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" 
containerName="extract-utilities" Oct 14 13:40:15 crc kubenswrapper[4837]: E1014 13:40:15.994736 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="extract-content" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994744 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="extract-content" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994938 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f15b18-ec22-451c-a974-5acdf2d683c7" containerName="registry-server" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.994979 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="aacf282f-f2c7-447d-9e73-98a35898f8df" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.995778 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.998427 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.998802 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.998833 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.998953 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 13:40:15.999084 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:40:15 crc kubenswrapper[4837]: I1014 
13:40:15.999877 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.004185 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f"] Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.017897 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152270 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152364 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152500 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7mx\" (UniqueName: \"kubernetes.io/projected/51ebf601-fdd4-46d5-b68e-97846a7baff5-kube-api-access-ht7mx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152835 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.152896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" 
(UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.153320 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.255513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7mx\" (UniqueName: \"kubernetes.io/projected/51ebf601-fdd4-46d5-b68e-97846a7baff5-kube-api-access-ht7mx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.256408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.257560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.258631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.260093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.260377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.260483 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.260534 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.260581 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.260612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.262233 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.262487 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.268069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.268399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.268514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.268533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.269675 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 
crc kubenswrapper[4837]: I1014 13:40:16.280532 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7mx\" (UniqueName: \"kubernetes.io/projected/51ebf601-fdd4-46d5-b68e-97846a7baff5-kube-api-access-ht7mx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9v67f\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.358862 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:40:16 crc kubenswrapper[4837]: I1014 13:40:16.916230 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f"] Oct 14 13:40:17 crc kubenswrapper[4837]: I1014 13:40:17.892997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" event={"ID":"51ebf601-fdd4-46d5-b68e-97846a7baff5","Type":"ContainerStarted","Data":"505a2e91acfd4f3e2333d14c8a054d94d27a3d4d0d5bf495d6d8e3475a1983cf"} Oct 14 13:40:17 crc kubenswrapper[4837]: I1014 13:40:17.893511 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" event={"ID":"51ebf601-fdd4-46d5-b68e-97846a7baff5","Type":"ContainerStarted","Data":"dfc2003dc1fd57a8fd78bb6a788541031cf77dac66dbd974ba2410d87b391353"} Oct 14 13:40:17 crc kubenswrapper[4837]: I1014 13:40:17.915798 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" podStartSLOduration=2.316157851 podStartE2EDuration="2.915774118s" podCreationTimestamp="2025-10-14 13:40:15 +0000 UTC" firstStartedPulling="2025-10-14 13:40:16.930945055 +0000 UTC m=+2354.847944908" lastFinishedPulling="2025-10-14 13:40:17.530561322 +0000 UTC m=+2355.447561175" observedRunningTime="2025-10-14 13:40:17.909038942 
+0000 UTC m=+2355.826038765" watchObservedRunningTime="2025-10-14 13:40:17.915774118 +0000 UTC m=+2355.832773971" Oct 14 13:40:22 crc kubenswrapper[4837]: I1014 13:40:22.795948 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:40:22 crc kubenswrapper[4837]: E1014 13:40:22.797027 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:40:36 crc kubenswrapper[4837]: I1014 13:40:36.785282 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:40:36 crc kubenswrapper[4837]: E1014 13:40:36.786234 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:40:51 crc kubenswrapper[4837]: I1014 13:40:51.784382 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:40:51 crc kubenswrapper[4837]: E1014 13:40:51.785595 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:41:03 crc kubenswrapper[4837]: I1014 13:41:03.784812 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:41:03 crc kubenswrapper[4837]: E1014 13:41:03.785639 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:41:15 crc kubenswrapper[4837]: I1014 13:41:15.784411 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:41:15 crc kubenswrapper[4837]: E1014 13:41:15.785038 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:41:27 crc kubenswrapper[4837]: I1014 13:41:27.784925 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:41:27 crc kubenswrapper[4837]: E1014 13:41:27.785738 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:41:38 crc kubenswrapper[4837]: I1014 13:41:38.785000 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:41:38 crc kubenswrapper[4837]: E1014 13:41:38.786050 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:41:52 crc kubenswrapper[4837]: I1014 13:41:52.790701 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:41:52 crc kubenswrapper[4837]: E1014 13:41:52.791706 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:42:04 crc kubenswrapper[4837]: I1014 13:42:04.785885 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:42:04 crc kubenswrapper[4837]: E1014 13:42:04.786916 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:42:16 crc kubenswrapper[4837]: I1014 13:42:16.784896 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:42:16 crc kubenswrapper[4837]: E1014 13:42:16.786102 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.622272 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvhrc"] Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.624266 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.640005 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvhrc"] Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.813902 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmnm8\" (UniqueName: \"kubernetes.io/projected/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-kube-api-access-jmnm8\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.813962 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-utilities\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.813984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-catalog-content\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.915715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmnm8\" (UniqueName: \"kubernetes.io/projected/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-kube-api-access-jmnm8\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.916072 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-utilities\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.916199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-catalog-content\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.916549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-utilities\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.916626 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-catalog-content\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:20 crc kubenswrapper[4837]: I1014 13:42:20.932602 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmnm8\" (UniqueName: \"kubernetes.io/projected/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-kube-api-access-jmnm8\") pod \"redhat-operators-hvhrc\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:21 crc kubenswrapper[4837]: I1014 13:42:21.019758 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:21 crc kubenswrapper[4837]: I1014 13:42:21.297007 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvhrc"] Oct 14 13:42:22 crc kubenswrapper[4837]: I1014 13:42:22.146533 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerID="1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c" exitCode=0 Oct 14 13:42:22 crc kubenswrapper[4837]: I1014 13:42:22.146610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerDied","Data":"1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c"} Oct 14 13:42:22 crc kubenswrapper[4837]: I1014 13:42:22.146853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerStarted","Data":"4dff05a29a355ae03cc347bebc7961926c375d832cacfeda5bda7a91ce97987b"} Oct 14 13:42:22 crc kubenswrapper[4837]: I1014 13:42:22.149619 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:42:23 crc kubenswrapper[4837]: I1014 13:42:23.159421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerStarted","Data":"50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584"} Oct 14 13:42:24 crc kubenswrapper[4837]: I1014 13:42:24.174512 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerID="50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584" exitCode=0 Oct 14 13:42:24 crc kubenswrapper[4837]: I1014 13:42:24.174950 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerDied","Data":"50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584"} Oct 14 13:42:25 crc kubenswrapper[4837]: I1014 13:42:25.190253 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerStarted","Data":"0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0"} Oct 14 13:42:25 crc kubenswrapper[4837]: I1014 13:42:25.214590 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvhrc" podStartSLOduration=2.532451912 podStartE2EDuration="5.214573012s" podCreationTimestamp="2025-10-14 13:42:20 +0000 UTC" firstStartedPulling="2025-10-14 13:42:22.149351345 +0000 UTC m=+2480.066351168" lastFinishedPulling="2025-10-14 13:42:24.831472455 +0000 UTC m=+2482.748472268" observedRunningTime="2025-10-14 13:42:25.209285186 +0000 UTC m=+2483.126285009" watchObservedRunningTime="2025-10-14 13:42:25.214573012 +0000 UTC m=+2483.131572825" Oct 14 13:42:31 crc kubenswrapper[4837]: I1014 13:42:31.020355 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:31 crc kubenswrapper[4837]: I1014 13:42:31.020796 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:31 crc kubenswrapper[4837]: I1014 13:42:31.092675 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:31 crc kubenswrapper[4837]: I1014 13:42:31.337452 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:31 crc kubenswrapper[4837]: I1014 13:42:31.414191 4837 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvhrc"] Oct 14 13:42:31 crc kubenswrapper[4837]: I1014 13:42:31.787217 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:42:31 crc kubenswrapper[4837]: E1014 13:42:31.787557 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:42:33 crc kubenswrapper[4837]: I1014 13:42:33.294342 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvhrc" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="registry-server" containerID="cri-o://0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0" gracePeriod=2 Oct 14 13:42:34 crc kubenswrapper[4837]: I1014 13:42:34.850618 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.009839 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-catalog-content\") pod \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.010214 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmnm8\" (UniqueName: \"kubernetes.io/projected/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-kube-api-access-jmnm8\") pod \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.010248 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-utilities\") pod \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\" (UID: \"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed\") " Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.011107 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-utilities" (OuterVolumeSpecName: "utilities") pod "6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" (UID: "6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.019327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-kube-api-access-jmnm8" (OuterVolumeSpecName: "kube-api-access-jmnm8") pod "6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" (UID: "6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed"). InnerVolumeSpecName "kube-api-access-jmnm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.098246 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" (UID: "6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.112763 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.112810 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmnm8\" (UniqueName: \"kubernetes.io/projected/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-kube-api-access-jmnm8\") on node \"crc\" DevicePath \"\"" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.112827 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.325672 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerID="0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0" exitCode=0 Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.325745 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerDied","Data":"0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0"} Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.325807 4837 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvhrc" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.325827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvhrc" event={"ID":"6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed","Type":"ContainerDied","Data":"4dff05a29a355ae03cc347bebc7961926c375d832cacfeda5bda7a91ce97987b"} Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.325897 4837 scope.go:117] "RemoveContainer" containerID="0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.386010 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvhrc"] Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.392758 4837 scope.go:117] "RemoveContainer" containerID="50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.407492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvhrc"] Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.433663 4837 scope.go:117] "RemoveContainer" containerID="1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.466276 4837 scope.go:117] "RemoveContainer" containerID="0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0" Oct 14 13:42:35 crc kubenswrapper[4837]: E1014 13:42:35.466761 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0\": container with ID starting with 0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0 not found: ID does not exist" containerID="0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.467303 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0"} err="failed to get container status \"0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0\": rpc error: code = NotFound desc = could not find container \"0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0\": container with ID starting with 0a6a66f9ee55cd321f86273ea1d149b9b7fd9a9dfbf147df62f78b329b1e8ca0 not found: ID does not exist" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.467426 4837 scope.go:117] "RemoveContainer" containerID="50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584" Oct 14 13:42:35 crc kubenswrapper[4837]: E1014 13:42:35.470367 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584\": container with ID starting with 50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584 not found: ID does not exist" containerID="50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.470540 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584"} err="failed to get container status \"50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584\": rpc error: code = NotFound desc = could not find container \"50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584\": container with ID starting with 50d22480135bff6da838f69ece7666f83a4749acb6fb1d0c5de774fc9a55b584 not found: ID does not exist" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.470717 4837 scope.go:117] "RemoveContainer" containerID="1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c" Oct 14 13:42:35 crc kubenswrapper[4837]: E1014 
13:42:35.472340 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c\": container with ID starting with 1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c not found: ID does not exist" containerID="1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c" Oct 14 13:42:35 crc kubenswrapper[4837]: I1014 13:42:35.472415 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c"} err="failed to get container status \"1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c\": rpc error: code = NotFound desc = could not find container \"1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c\": container with ID starting with 1e1fce360a9cc1aa082ef04389aad15c04bf5f04a85e91e5fa225dae3aa9d27c not found: ID does not exist" Oct 14 13:42:36 crc kubenswrapper[4837]: I1014 13:42:36.807089 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" path="/var/lib/kubelet/pods/6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed/volumes" Oct 14 13:42:42 crc kubenswrapper[4837]: I1014 13:42:42.790276 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:42:43 crc kubenswrapper[4837]: I1014 13:42:43.427360 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"16f108fe71a51812ebae976a48c956d270ab585893aa3a25beab8e26a9f907af"} Oct 14 13:43:49 crc kubenswrapper[4837]: E1014 13:43:49.688773 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ebf601_fdd4_46d5_b68e_97846a7baff5.slice/crio-505a2e91acfd4f3e2333d14c8a054d94d27a3d4d0d5bf495d6d8e3475a1983cf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ebf601_fdd4_46d5_b68e_97846a7baff5.slice/crio-conmon-505a2e91acfd4f3e2333d14c8a054d94d27a3d4d0d5bf495d6d8e3475a1983cf.scope\": RecentStats: unable to find data in memory cache]" Oct 14 13:43:50 crc kubenswrapper[4837]: I1014 13:43:50.182402 4837 generic.go:334] "Generic (PLEG): container finished" podID="51ebf601-fdd4-46d5-b68e-97846a7baff5" containerID="505a2e91acfd4f3e2333d14c8a054d94d27a3d4d0d5bf495d6d8e3475a1983cf" exitCode=0 Oct 14 13:43:50 crc kubenswrapper[4837]: I1014 13:43:50.182942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" event={"ID":"51ebf601-fdd4-46d5-b68e-97846a7baff5","Type":"ContainerDied","Data":"505a2e91acfd4f3e2333d14c8a054d94d27a3d4d0d5bf495d6d8e3475a1983cf"} Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.787826 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821268 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-1\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821328 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-ssh-key\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821352 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-0\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-1\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821437 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht7mx\" (UniqueName: \"kubernetes.io/projected/51ebf601-fdd4-46d5-b68e-97846a7baff5-kube-api-access-ht7mx\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821481 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-combined-ca-bundle\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821538 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-extra-config-0\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-0\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.821632 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-inventory\") pod \"51ebf601-fdd4-46d5-b68e-97846a7baff5\" (UID: \"51ebf601-fdd4-46d5-b68e-97846a7baff5\") " Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.838227 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ebf601-fdd4-46d5-b68e-97846a7baff5-kube-api-access-ht7mx" (OuterVolumeSpecName: "kube-api-access-ht7mx") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "kube-api-access-ht7mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.843989 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.869614 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.869785 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-inventory" (OuterVolumeSpecName: "inventory") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.871781 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.881012 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.886500 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.888029 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.890710 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "51ebf601-fdd4-46d5-b68e-97846a7baff5" (UID: "51ebf601-fdd4-46d5-b68e-97846a7baff5"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923445 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923484 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923494 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923503 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923511 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923520 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht7mx\" (UniqueName: \"kubernetes.io/projected/51ebf601-fdd4-46d5-b68e-97846a7baff5-kube-api-access-ht7mx\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923542 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 
13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923552 4837 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:51 crc kubenswrapper[4837]: I1014 13:43:51.923562 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/51ebf601-fdd4-46d5-b68e-97846a7baff5-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.206454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" event={"ID":"51ebf601-fdd4-46d5-b68e-97846a7baff5","Type":"ContainerDied","Data":"dfc2003dc1fd57a8fd78bb6a788541031cf77dac66dbd974ba2410d87b391353"} Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.206493 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc2003dc1fd57a8fd78bb6a788541031cf77dac66dbd974ba2410d87b391353" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.206545 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9v67f" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.313523 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn"] Oct 14 13:43:52 crc kubenswrapper[4837]: E1014 13:43:52.313923 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="extract-utilities" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.313943 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="extract-utilities" Oct 14 13:43:52 crc kubenswrapper[4837]: E1014 13:43:52.313967 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="registry-server" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.313978 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="registry-server" Oct 14 13:43:52 crc kubenswrapper[4837]: E1014 13:43:52.314001 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="extract-content" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.314007 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="extract-content" Oct 14 13:43:52 crc kubenswrapper[4837]: E1014 13:43:52.314026 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ebf601-fdd4-46d5-b68e-97846a7baff5" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.314032 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ebf601-fdd4-46d5-b68e-97846a7baff5" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.314227 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="51ebf601-fdd4-46d5-b68e-97846a7baff5" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.314239 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e35f8b4-3fe2-4efa-bd62-b2f3baef32ed" containerName="registry-server" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.314848 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.318406 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.318459 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.318535 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.318678 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-stsgq" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.318764 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.329151 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 
13:43:52.329253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.329342 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqx4f\" (UniqueName: \"kubernetes.io/projected/aa5f8d90-d124-49cd-ac34-f24b91f0a457-kube-api-access-vqx4f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.329429 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.329507 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.329567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.329623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.336740 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn"] Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.431875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqx4f\" (UniqueName: \"kubernetes.io/projected/aa5f8d90-d124-49cd-ac34-f24b91f0a457-kube-api-access-vqx4f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.432345 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.432415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.432469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.432518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.432566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.432597 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: 
\"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.443076 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.443287 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.443420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.443436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.445598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.446756 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.449526 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqx4f\" (UniqueName: \"kubernetes.io/projected/aa5f8d90-d124-49cd-ac34-f24b91f0a457-kube-api-access-vqx4f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:52 crc kubenswrapper[4837]: I1014 13:43:52.653524 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:43:53 crc kubenswrapper[4837]: I1014 13:43:53.311639 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn"] Oct 14 13:43:54 crc kubenswrapper[4837]: I1014 13:43:54.225015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" event={"ID":"aa5f8d90-d124-49cd-ac34-f24b91f0a457","Type":"ContainerStarted","Data":"71c0097883189a95d5fdeb1fb114832ad148fff8bef4f126531a1da45efcfd1a"} Oct 14 13:43:54 crc kubenswrapper[4837]: I1014 13:43:54.225603 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" event={"ID":"aa5f8d90-d124-49cd-ac34-f24b91f0a457","Type":"ContainerStarted","Data":"6b058abce843edd6ab1b8a0215a1a699586a963b509d6cfdd9191f8f4577959a"} Oct 14 13:43:54 crc kubenswrapper[4837]: I1014 13:43:54.263755 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" podStartSLOduration=1.79180735 podStartE2EDuration="2.263730824s" podCreationTimestamp="2025-10-14 13:43:52 +0000 UTC" firstStartedPulling="2025-10-14 13:43:53.315790124 +0000 UTC m=+2571.232789927" lastFinishedPulling="2025-10-14 13:43:53.787713558 +0000 UTC m=+2571.704713401" observedRunningTime="2025-10-14 13:43:54.250688009 +0000 UTC m=+2572.167687822" watchObservedRunningTime="2025-10-14 13:43:54.263730824 +0000 UTC m=+2572.180730677" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.165960 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m"] Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.167950 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.170352 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.171227 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.181857 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m"] Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.211102 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zz5\" (UniqueName: \"kubernetes.io/projected/56754302-a266-40d0-b5df-36a62b08b80f-kube-api-access-t2zz5\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.211142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56754302-a266-40d0-b5df-36a62b08b80f-secret-volume\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.211474 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56754302-a266-40d0-b5df-36a62b08b80f-config-volume\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.313235 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zz5\" (UniqueName: \"kubernetes.io/projected/56754302-a266-40d0-b5df-36a62b08b80f-kube-api-access-t2zz5\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.313494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56754302-a266-40d0-b5df-36a62b08b80f-secret-volume\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.313710 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56754302-a266-40d0-b5df-36a62b08b80f-config-volume\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.315549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56754302-a266-40d0-b5df-36a62b08b80f-config-volume\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.318852 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/56754302-a266-40d0-b5df-36a62b08b80f-secret-volume\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.336898 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zz5\" (UniqueName: \"kubernetes.io/projected/56754302-a266-40d0-b5df-36a62b08b80f-kube-api-access-t2zz5\") pod \"collect-profiles-29340825-9fh4m\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.496805 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:00 crc kubenswrapper[4837]: I1014 13:45:00.969928 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m"] Oct 14 13:45:01 crc kubenswrapper[4837]: I1014 13:45:01.954258 4837 generic.go:334] "Generic (PLEG): container finished" podID="56754302-a266-40d0-b5df-36a62b08b80f" containerID="6ce069b0481671a9ef9e01bd19f63cfdb1d25c82f69f0f37be8b886941336d91" exitCode=0 Oct 14 13:45:01 crc kubenswrapper[4837]: I1014 13:45:01.954309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" event={"ID":"56754302-a266-40d0-b5df-36a62b08b80f","Type":"ContainerDied","Data":"6ce069b0481671a9ef9e01bd19f63cfdb1d25c82f69f0f37be8b886941336d91"} Oct 14 13:45:01 crc kubenswrapper[4837]: I1014 13:45:01.955017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" 
event={"ID":"56754302-a266-40d0-b5df-36a62b08b80f","Type":"ContainerStarted","Data":"720e9dfe1ae4fff0ab486d0fcd09d4c52bfa8a01c318621c0cf73b5e227f43e8"} Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.318351 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.369330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56754302-a266-40d0-b5df-36a62b08b80f-secret-volume\") pod \"56754302-a266-40d0-b5df-36a62b08b80f\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.369404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2zz5\" (UniqueName: \"kubernetes.io/projected/56754302-a266-40d0-b5df-36a62b08b80f-kube-api-access-t2zz5\") pod \"56754302-a266-40d0-b5df-36a62b08b80f\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.369423 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56754302-a266-40d0-b5df-36a62b08b80f-config-volume\") pod \"56754302-a266-40d0-b5df-36a62b08b80f\" (UID: \"56754302-a266-40d0-b5df-36a62b08b80f\") " Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.369914 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56754302-a266-40d0-b5df-36a62b08b80f-config-volume" (OuterVolumeSpecName: "config-volume") pod "56754302-a266-40d0-b5df-36a62b08b80f" (UID: "56754302-a266-40d0-b5df-36a62b08b80f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.375409 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56754302-a266-40d0-b5df-36a62b08b80f-kube-api-access-t2zz5" (OuterVolumeSpecName: "kube-api-access-t2zz5") pod "56754302-a266-40d0-b5df-36a62b08b80f" (UID: "56754302-a266-40d0-b5df-36a62b08b80f"). InnerVolumeSpecName "kube-api-access-t2zz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.375418 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56754302-a266-40d0-b5df-36a62b08b80f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "56754302-a266-40d0-b5df-36a62b08b80f" (UID: "56754302-a266-40d0-b5df-36a62b08b80f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.471045 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56754302-a266-40d0-b5df-36a62b08b80f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.471084 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2zz5\" (UniqueName: \"kubernetes.io/projected/56754302-a266-40d0-b5df-36a62b08b80f-kube-api-access-t2zz5\") on node \"crc\" DevicePath \"\"" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.471093 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56754302-a266-40d0-b5df-36a62b08b80f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.983315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" 
event={"ID":"56754302-a266-40d0-b5df-36a62b08b80f","Type":"ContainerDied","Data":"720e9dfe1ae4fff0ab486d0fcd09d4c52bfa8a01c318621c0cf73b5e227f43e8"} Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.983397 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="720e9dfe1ae4fff0ab486d0fcd09d4c52bfa8a01c318621c0cf73b5e227f43e8" Oct 14 13:45:03 crc kubenswrapper[4837]: I1014 13:45:03.983400 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340825-9fh4m" Oct 14 13:45:04 crc kubenswrapper[4837]: I1014 13:45:04.398307 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z"] Oct 14 13:45:04 crc kubenswrapper[4837]: I1014 13:45:04.409972 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340780-cjt2z"] Oct 14 13:45:04 crc kubenswrapper[4837]: I1014 13:45:04.802687 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71c72b3-1e65-4ba0-b0c9-e1faaed535a5" path="/var/lib/kubelet/pods/f71c72b3-1e65-4ba0-b0c9-e1faaed535a5/volumes" Oct 14 13:45:07 crc kubenswrapper[4837]: I1014 13:45:07.736332 4837 scope.go:117] "RemoveContainer" containerID="c0160032115176d3853cc8ec051c96849cb3424aee5ed74716737b2714a99d57" Oct 14 13:45:11 crc kubenswrapper[4837]: I1014 13:45:11.140843 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:45:11 crc kubenswrapper[4837]: I1014 13:45:11.141760 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.684407 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qvs2d"] Oct 14 13:45:40 crc kubenswrapper[4837]: E1014 13:45:40.686847 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56754302-a266-40d0-b5df-36a62b08b80f" containerName="collect-profiles" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.686878 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="56754302-a266-40d0-b5df-36a62b08b80f" containerName="collect-profiles" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.687069 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="56754302-a266-40d0-b5df-36a62b08b80f" containerName="collect-profiles" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.689487 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.702900 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvs2d"] Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.874221 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-utilities\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.874277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjh99\" (UniqueName: \"kubernetes.io/projected/55c99766-665a-4f8c-a652-360c455bfd5b-kube-api-access-fjh99\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.874352 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-catalog-content\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.976591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-utilities\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.976652 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fjh99\" (UniqueName: \"kubernetes.io/projected/55c99766-665a-4f8c-a652-360c455bfd5b-kube-api-access-fjh99\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.976725 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-catalog-content\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.977467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-utilities\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:40 crc kubenswrapper[4837]: I1014 13:45:40.977882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-catalog-content\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:41 crc kubenswrapper[4837]: I1014 13:45:40.997367 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjh99\" (UniqueName: \"kubernetes.io/projected/55c99766-665a-4f8c-a652-360c455bfd5b-kube-api-access-fjh99\") pod \"redhat-marketplace-qvs2d\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:41 crc kubenswrapper[4837]: I1014 13:45:41.016127 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:41 crc kubenswrapper[4837]: I1014 13:45:41.140182 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:45:41 crc kubenswrapper[4837]: I1014 13:45:41.140495 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:45:41 crc kubenswrapper[4837]: I1014 13:45:41.433715 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvs2d"] Oct 14 13:45:42 crc kubenswrapper[4837]: I1014 13:45:42.357628 4837 generic.go:334] "Generic (PLEG): container finished" podID="55c99766-665a-4f8c-a652-360c455bfd5b" containerID="ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4" exitCode=0 Oct 14 13:45:42 crc kubenswrapper[4837]: I1014 13:45:42.357677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvs2d" event={"ID":"55c99766-665a-4f8c-a652-360c455bfd5b","Type":"ContainerDied","Data":"ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4"} Oct 14 13:45:42 crc kubenswrapper[4837]: I1014 13:45:42.357707 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvs2d" event={"ID":"55c99766-665a-4f8c-a652-360c455bfd5b","Type":"ContainerStarted","Data":"fcc01f60ab04441978552c83edce6a31867b5db9ea817364a030ed4e2fb9a1b0"} Oct 14 13:45:44 crc kubenswrapper[4837]: I1014 13:45:44.375387 4837 generic.go:334] "Generic (PLEG): 
container finished" podID="55c99766-665a-4f8c-a652-360c455bfd5b" containerID="5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a" exitCode=0 Oct 14 13:45:44 crc kubenswrapper[4837]: I1014 13:45:44.375430 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvs2d" event={"ID":"55c99766-665a-4f8c-a652-360c455bfd5b","Type":"ContainerDied","Data":"5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a"} Oct 14 13:45:45 crc kubenswrapper[4837]: I1014 13:45:45.393230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvs2d" event={"ID":"55c99766-665a-4f8c-a652-360c455bfd5b","Type":"ContainerStarted","Data":"7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415"} Oct 14 13:45:45 crc kubenswrapper[4837]: I1014 13:45:45.412964 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qvs2d" podStartSLOduration=2.838585485 podStartE2EDuration="5.412941952s" podCreationTimestamp="2025-10-14 13:45:40 +0000 UTC" firstStartedPulling="2025-10-14 13:45:42.361313885 +0000 UTC m=+2680.278313698" lastFinishedPulling="2025-10-14 13:45:44.935670312 +0000 UTC m=+2682.852670165" observedRunningTime="2025-10-14 13:45:45.410403784 +0000 UTC m=+2683.327403617" watchObservedRunningTime="2025-10-14 13:45:45.412941952 +0000 UTC m=+2683.329941775" Oct 14 13:45:51 crc kubenswrapper[4837]: I1014 13:45:51.017539 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:51 crc kubenswrapper[4837]: I1014 13:45:51.017907 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:51 crc kubenswrapper[4837]: I1014 13:45:51.067302 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qvs2d" 
Oct 14 13:45:51 crc kubenswrapper[4837]: I1014 13:45:51.498555 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:51 crc kubenswrapper[4837]: I1014 13:45:51.568032 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvs2d"] Oct 14 13:45:53 crc kubenswrapper[4837]: I1014 13:45:53.472676 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qvs2d" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="registry-server" containerID="cri-o://7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415" gracePeriod=2 Oct 14 13:45:53 crc kubenswrapper[4837]: I1014 13:45:53.912001 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.019864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-catalog-content\") pod \"55c99766-665a-4f8c-a652-360c455bfd5b\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.020030 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjh99\" (UniqueName: \"kubernetes.io/projected/55c99766-665a-4f8c-a652-360c455bfd5b-kube-api-access-fjh99\") pod \"55c99766-665a-4f8c-a652-360c455bfd5b\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.020127 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-utilities\") pod \"55c99766-665a-4f8c-a652-360c455bfd5b\" (UID: \"55c99766-665a-4f8c-a652-360c455bfd5b\") " 
Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.021016 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-utilities" (OuterVolumeSpecName: "utilities") pod "55c99766-665a-4f8c-a652-360c455bfd5b" (UID: "55c99766-665a-4f8c-a652-360c455bfd5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.032323 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c99766-665a-4f8c-a652-360c455bfd5b-kube-api-access-fjh99" (OuterVolumeSpecName: "kube-api-access-fjh99") pod "55c99766-665a-4f8c-a652-360c455bfd5b" (UID: "55c99766-665a-4f8c-a652-360c455bfd5b"). InnerVolumeSpecName "kube-api-access-fjh99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.039899 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55c99766-665a-4f8c-a652-360c455bfd5b" (UID: "55c99766-665a-4f8c-a652-360c455bfd5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.122085 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.122127 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c99766-665a-4f8c-a652-360c455bfd5b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.122140 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjh99\" (UniqueName: \"kubernetes.io/projected/55c99766-665a-4f8c-a652-360c455bfd5b-kube-api-access-fjh99\") on node \"crc\" DevicePath \"\"" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.485335 4837 generic.go:334] "Generic (PLEG): container finished" podID="55c99766-665a-4f8c-a652-360c455bfd5b" containerID="7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415" exitCode=0 Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.485381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvs2d" event={"ID":"55c99766-665a-4f8c-a652-360c455bfd5b","Type":"ContainerDied","Data":"7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415"} Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.485409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvs2d" event={"ID":"55c99766-665a-4f8c-a652-360c455bfd5b","Type":"ContainerDied","Data":"fcc01f60ab04441978552c83edce6a31867b5db9ea817364a030ed4e2fb9a1b0"} Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.485429 4837 scope.go:117] "RemoveContainer" containerID="7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 
13:45:54.485480 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvs2d" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.506611 4837 scope.go:117] "RemoveContainer" containerID="5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.525898 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvs2d"] Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.532728 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvs2d"] Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.545686 4837 scope.go:117] "RemoveContainer" containerID="ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.580722 4837 scope.go:117] "RemoveContainer" containerID="7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415" Oct 14 13:45:54 crc kubenswrapper[4837]: E1014 13:45:54.581269 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415\": container with ID starting with 7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415 not found: ID does not exist" containerID="7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.581310 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415"} err="failed to get container status \"7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415\": rpc error: code = NotFound desc = could not find container \"7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415\": container with ID starting with 
7e9db6abde2c2844d5d33bc9316cb31bb0b0d21b38f1f09e3991e0b4e5f88415 not found: ID does not exist" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.581337 4837 scope.go:117] "RemoveContainer" containerID="5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a" Oct 14 13:45:54 crc kubenswrapper[4837]: E1014 13:45:54.581671 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a\": container with ID starting with 5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a not found: ID does not exist" containerID="5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.581714 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a"} err="failed to get container status \"5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a\": rpc error: code = NotFound desc = could not find container \"5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a\": container with ID starting with 5845ef4cd766b095011297b2e1ec98d503184178dd6313307a7588fba1829d0a not found: ID does not exist" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.581746 4837 scope.go:117] "RemoveContainer" containerID="ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4" Oct 14 13:45:54 crc kubenswrapper[4837]: E1014 13:45:54.582126 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4\": container with ID starting with ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4 not found: ID does not exist" containerID="ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4" Oct 14 13:45:54 crc 
kubenswrapper[4837]: I1014 13:45:54.582308 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4"} err="failed to get container status \"ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4\": rpc error: code = NotFound desc = could not find container \"ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4\": container with ID starting with ae6fd0d49eaecd1e50475869bd04438db1823192091824c3d2ca8ab23ae266e4 not found: ID does not exist" Oct 14 13:45:54 crc kubenswrapper[4837]: I1014 13:45:54.799239 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" path="/var/lib/kubelet/pods/55c99766-665a-4f8c-a652-360c455bfd5b/volumes" Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.140484 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.141272 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.141341 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.142421 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"16f108fe71a51812ebae976a48c956d270ab585893aa3a25beab8e26a9f907af"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.142525 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://16f108fe71a51812ebae976a48c956d270ab585893aa3a25beab8e26a9f907af" gracePeriod=600 Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.648691 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="16f108fe71a51812ebae976a48c956d270ab585893aa3a25beab8e26a9f907af" exitCode=0 Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.648769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"16f108fe71a51812ebae976a48c956d270ab585893aa3a25beab8e26a9f907af"} Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.649043 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de"} Oct 14 13:46:11 crc kubenswrapper[4837]: I1014 13:46:11.649074 4837 scope.go:117] "RemoveContainer" containerID="67995ddd4a96045876e7888516dbb02baaef4838e10da6df7baae4460058c9f0" Oct 14 13:46:31 crc kubenswrapper[4837]: I1014 13:46:31.853922 4837 generic.go:334] "Generic (PLEG): container finished" podID="aa5f8d90-d124-49cd-ac34-f24b91f0a457" 
containerID="71c0097883189a95d5fdeb1fb114832ad148fff8bef4f126531a1da45efcfd1a" exitCode=0 Oct 14 13:46:31 crc kubenswrapper[4837]: I1014 13:46:31.853995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" event={"ID":"aa5f8d90-d124-49cd-ac34-f24b91f0a457","Type":"ContainerDied","Data":"71c0097883189a95d5fdeb1fb114832ad148fff8bef4f126531a1da45efcfd1a"} Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.346624 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.428993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-2\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.429113 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ssh-key\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.429189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-0\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.429207 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-1\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.429244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-inventory\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.429273 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-telemetry-combined-ca-bundle\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.429396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqx4f\" (UniqueName: \"kubernetes.io/projected/aa5f8d90-d124-49cd-ac34-f24b91f0a457-kube-api-access-vqx4f\") pod \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\" (UID: \"aa5f8d90-d124-49cd-ac34-f24b91f0a457\") " Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.434498 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.434691 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5f8d90-d124-49cd-ac34-f24b91f0a457-kube-api-access-vqx4f" (OuterVolumeSpecName: "kube-api-access-vqx4f") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "kube-api-access-vqx4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.457232 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.462121 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-inventory" (OuterVolumeSpecName: "inventory") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.465263 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.465650 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.484580 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "aa5f8d90-d124-49cd-ac34-f24b91f0a457" (UID: "aa5f8d90-d124-49cd-ac34-f24b91f0a457"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531849 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqx4f\" (UniqueName: \"kubernetes.io/projected/aa5f8d90-d124-49cd-ac34-f24b91f0a457-kube-api-access-vqx4f\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531891 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531902 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531919 4837 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531939 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531954 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-inventory\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.531969 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5f8d90-d124-49cd-ac34-f24b91f0a457-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.880140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" event={"ID":"aa5f8d90-d124-49cd-ac34-f24b91f0a457","Type":"ContainerDied","Data":"6b058abce843edd6ab1b8a0215a1a699586a963b509d6cfdd9191f8f4577959a"} Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.880257 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b058abce843edd6ab1b8a0215a1a699586a963b509d6cfdd9191f8f4577959a" Oct 14 13:46:33 crc kubenswrapper[4837]: I1014 13:46:33.880383 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn" Oct 14 13:46:34 crc kubenswrapper[4837]: E1014 13:46:34.079348 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5f8d90_d124_49cd_ac34_f24b91f0a457.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5f8d90_d124_49cd_ac34_f24b91f0a457.slice/crio-6b058abce843edd6ab1b8a0215a1a699586a963b509d6cfdd9191f8f4577959a\": RecentStats: unable to find data in memory cache]" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.447066 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l7cv7"] Oct 14 13:46:48 crc kubenswrapper[4837]: E1014 13:46:48.448135 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="registry-server" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.448151 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="registry-server" Oct 14 13:46:48 crc kubenswrapper[4837]: E1014 13:46:48.448199 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5f8d90-d124-49cd-ac34-f24b91f0a457" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.448210 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5f8d90-d124-49cd-ac34-f24b91f0a457" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 14 13:46:48 crc kubenswrapper[4837]: E1014 13:46:48.448254 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="extract-content" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.448264 4837 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="extract-content" Oct 14 13:46:48 crc kubenswrapper[4837]: E1014 13:46:48.448278 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="extract-utilities" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.448286 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="extract-utilities" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.448497 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5f8d90-d124-49cd-ac34-f24b91f0a457" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.448524 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c99766-665a-4f8c-a652-360c455bfd5b" containerName="registry-server" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.450410 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.458671 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7cv7"] Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.540301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-catalog-content\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.540773 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4kg\" (UniqueName: \"kubernetes.io/projected/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-kube-api-access-mf4kg\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.540847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-utilities\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.643473 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-catalog-content\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.643591 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mf4kg\" (UniqueName: \"kubernetes.io/projected/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-kube-api-access-mf4kg\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.643680 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-utilities\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.644097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-catalog-content\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.644420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-utilities\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.665720 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf4kg\" (UniqueName: \"kubernetes.io/projected/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-kube-api-access-mf4kg\") pod \"community-operators-l7cv7\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:48 crc kubenswrapper[4837]: I1014 13:46:48.770560 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:49 crc kubenswrapper[4837]: I1014 13:46:49.336063 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7cv7"] Oct 14 13:46:50 crc kubenswrapper[4837]: I1014 13:46:50.055868 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerID="b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc" exitCode=0 Oct 14 13:46:50 crc kubenswrapper[4837]: I1014 13:46:50.055964 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7cv7" event={"ID":"ebddf228-fe3a-4878-a17f-4e58d9dc5fae","Type":"ContainerDied","Data":"b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc"} Oct 14 13:46:50 crc kubenswrapper[4837]: I1014 13:46:50.056389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7cv7" event={"ID":"ebddf228-fe3a-4878-a17f-4e58d9dc5fae","Type":"ContainerStarted","Data":"dfa375144ee2972a1308d5dd552b8e8cbbe0e54fecee402f1db8bd01d52fc795"} Oct 14 13:46:52 crc kubenswrapper[4837]: I1014 13:46:52.078950 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerID="d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054" exitCode=0 Oct 14 13:46:52 crc kubenswrapper[4837]: I1014 13:46:52.079536 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7cv7" event={"ID":"ebddf228-fe3a-4878-a17f-4e58d9dc5fae","Type":"ContainerDied","Data":"d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054"} Oct 14 13:46:53 crc kubenswrapper[4837]: I1014 13:46:53.088588 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7cv7" 
event={"ID":"ebddf228-fe3a-4878-a17f-4e58d9dc5fae","Type":"ContainerStarted","Data":"ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f"} Oct 14 13:46:53 crc kubenswrapper[4837]: I1014 13:46:53.111728 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l7cv7" podStartSLOduration=2.596519072 podStartE2EDuration="5.111709542s" podCreationTimestamp="2025-10-14 13:46:48 +0000 UTC" firstStartedPulling="2025-10-14 13:46:50.058626635 +0000 UTC m=+2747.975626448" lastFinishedPulling="2025-10-14 13:46:52.573817085 +0000 UTC m=+2750.490816918" observedRunningTime="2025-10-14 13:46:53.104612758 +0000 UTC m=+2751.021612581" watchObservedRunningTime="2025-10-14 13:46:53.111709542 +0000 UTC m=+2751.028709355" Oct 14 13:46:58 crc kubenswrapper[4837]: I1014 13:46:58.771819 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:58 crc kubenswrapper[4837]: I1014 13:46:58.772129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:58 crc kubenswrapper[4837]: I1014 13:46:58.826701 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:59 crc kubenswrapper[4837]: I1014 13:46:59.227854 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:46:59 crc kubenswrapper[4837]: I1014 13:46:59.297870 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7cv7"] Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.177994 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l7cv7" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="registry-server" 
containerID="cri-o://ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f" gracePeriod=2 Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.739303 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.818640 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf4kg\" (UniqueName: \"kubernetes.io/projected/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-kube-api-access-mf4kg\") pod \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.818733 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-utilities\") pod \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.818797 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-catalog-content\") pod \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\" (UID: \"ebddf228-fe3a-4878-a17f-4e58d9dc5fae\") " Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.820952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-utilities" (OuterVolumeSpecName: "utilities") pod "ebddf228-fe3a-4878-a17f-4e58d9dc5fae" (UID: "ebddf228-fe3a-4878-a17f-4e58d9dc5fae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.833638 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-kube-api-access-mf4kg" (OuterVolumeSpecName: "kube-api-access-mf4kg") pod "ebddf228-fe3a-4878-a17f-4e58d9dc5fae" (UID: "ebddf228-fe3a-4878-a17f-4e58d9dc5fae"). InnerVolumeSpecName "kube-api-access-mf4kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.921474 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf4kg\" (UniqueName: \"kubernetes.io/projected/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-kube-api-access-mf4kg\") on node \"crc\" DevicePath \"\"" Oct 14 13:47:01 crc kubenswrapper[4837]: I1014 13:47:01.921508 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.167234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebddf228-fe3a-4878-a17f-4e58d9dc5fae" (UID: "ebddf228-fe3a-4878-a17f-4e58d9dc5fae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.190715 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerID="ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f" exitCode=0 Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.190767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7cv7" event={"ID":"ebddf228-fe3a-4878-a17f-4e58d9dc5fae","Type":"ContainerDied","Data":"ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f"} Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.190806 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7cv7" event={"ID":"ebddf228-fe3a-4878-a17f-4e58d9dc5fae","Type":"ContainerDied","Data":"dfa375144ee2972a1308d5dd552b8e8cbbe0e54fecee402f1db8bd01d52fc795"} Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.190828 4837 scope.go:117] "RemoveContainer" containerID="ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.190853 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l7cv7" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.227602 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebddf228-fe3a-4878-a17f-4e58d9dc5fae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.228909 4837 scope.go:117] "RemoveContainer" containerID="d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.237038 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7cv7"] Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.245746 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l7cv7"] Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.262904 4837 scope.go:117] "RemoveContainer" containerID="b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.299449 4837 scope.go:117] "RemoveContainer" containerID="ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f" Oct 14 13:47:02 crc kubenswrapper[4837]: E1014 13:47:02.299986 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f\": container with ID starting with ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f not found: ID does not exist" containerID="ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.300150 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f"} err="failed to get container status 
\"ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f\": rpc error: code = NotFound desc = could not find container \"ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f\": container with ID starting with ab67d4b2f266de190defbeb88315e99349a9dc1f24232e1d0e3212b6b46be75f not found: ID does not exist" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.300306 4837 scope.go:117] "RemoveContainer" containerID="d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054" Oct 14 13:47:02 crc kubenswrapper[4837]: E1014 13:47:02.300909 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054\": container with ID starting with d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054 not found: ID does not exist" containerID="d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.300965 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054"} err="failed to get container status \"d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054\": rpc error: code = NotFound desc = could not find container \"d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054\": container with ID starting with d1a8db68b69df5b8cee4c863fa1252b067748bf7e933a11f8620018ad3c1b054 not found: ID does not exist" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.300996 4837 scope.go:117] "RemoveContainer" containerID="b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc" Oct 14 13:47:02 crc kubenswrapper[4837]: E1014 13:47:02.301469 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc\": container with ID starting with b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc not found: ID does not exist" containerID="b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.301505 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc"} err="failed to get container status \"b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc\": rpc error: code = NotFound desc = could not find container \"b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc\": container with ID starting with b10a1df6408952f2376167002a33d4d9f3a6396ff3b3e38dbaea88536d2578bc not found: ID does not exist" Oct 14 13:47:02 crc kubenswrapper[4837]: I1014 13:47:02.814803 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" path="/var/lib/kubelet/pods/ebddf228-fe3a-4878-a17f-4e58d9dc5fae/volumes" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.471649 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 14 13:47:19 crc kubenswrapper[4837]: E1014 13:47:19.472817 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="registry-server" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.472838 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="registry-server" Oct 14 13:47:19 crc kubenswrapper[4837]: E1014 13:47:19.472869 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="extract-utilities" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.472881 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="extract-utilities" Oct 14 13:47:19 crc kubenswrapper[4837]: E1014 13:47:19.472928 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="extract-content" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.472938 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="extract-content" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.473214 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebddf228-fe3a-4878-a17f-4e58d9dc5fae" containerName="registry-server" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.474063 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.476112 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.476127 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.476130 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.477302 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gbfpm" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.484659 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.594749 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.594839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.594890 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.594967 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.594991 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-config-data\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.595013 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggtp\" (UniqueName: 
\"kubernetes.io/projected/beac4f98-00d6-438b-86cc-2f85d2ca1f96-kube-api-access-8ggtp\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.595047 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.595064 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.595142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697414 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-config-data\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697883 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggtp\" (UniqueName: \"kubernetes.io/projected/beac4f98-00d6-438b-86cc-2f85d2ca1f96-kube-api-access-8ggtp\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.697993 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.698033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.698441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.698491 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.698640 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.698826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.700491 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-config-data\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.705137 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.705751 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.711933 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.720103 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggtp\" (UniqueName: \"kubernetes.io/projected/beac4f98-00d6-438b-86cc-2f85d2ca1f96-kube-api-access-8ggtp\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.737257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " pod="openstack/tempest-tests-tempest" Oct 14 13:47:19 crc kubenswrapper[4837]: I1014 13:47:19.801114 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 14 13:47:20 crc kubenswrapper[4837]: I1014 13:47:20.213460 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 14 13:47:20 crc kubenswrapper[4837]: I1014 13:47:20.402954 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"beac4f98-00d6-438b-86cc-2f85d2ca1f96","Type":"ContainerStarted","Data":"a14b21bea8a80ed7a7c683d09fdfaa2484e6f0f190cac8f5a02c2de43911e0be"} Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.680688 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g58t9"] Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.686064 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.692946 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g58t9"] Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.856984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-catalog-content\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.857088 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-utilities\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.857565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4whfn\" (UniqueName: \"kubernetes.io/projected/8390c75c-afa2-498e-baea-ebf42594094f-kube-api-access-4whfn\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.959711 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-catalog-content\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.959843 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-utilities\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.960451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4whfn\" (UniqueName: \"kubernetes.io/projected/8390c75c-afa2-498e-baea-ebf42594094f-kube-api-access-4whfn\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.960380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-utilities\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.960288 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-catalog-content\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:46 crc kubenswrapper[4837]: I1014 13:47:46.989491 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4whfn\" (UniqueName: \"kubernetes.io/projected/8390c75c-afa2-498e-baea-ebf42594094f-kube-api-access-4whfn\") pod \"certified-operators-g58t9\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:47 crc kubenswrapper[4837]: I1014 13:47:47.050921 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:50 crc kubenswrapper[4837]: E1014 13:47:50.116776 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 14 13:47:50 crc kubenswrapper[4837]: E1014 13:47:50.117231 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trus
t/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ggtp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(beac4f98-00d6-438b-86cc-2f85d2ca1f96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 14 13:47:50 crc kubenswrapper[4837]: E1014 13:47:50.118756 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="beac4f98-00d6-438b-86cc-2f85d2ca1f96" Oct 14 13:47:50 crc kubenswrapper[4837]: I1014 13:47:50.559748 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g58t9"] Oct 14 13:47:50 crc kubenswrapper[4837]: I1014 13:47:50.725675 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g58t9" event={"ID":"8390c75c-afa2-498e-baea-ebf42594094f","Type":"ContainerStarted","Data":"769eb373e7bebf5866f2c3d9b2679f4cade8c00e7c0168304da8226ccdb61f34"} Oct 14 13:47:50 crc kubenswrapper[4837]: E1014 13:47:50.727329 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="beac4f98-00d6-438b-86cc-2f85d2ca1f96" Oct 14 13:47:51 crc kubenswrapper[4837]: I1014 13:47:51.734715 4837 generic.go:334] "Generic (PLEG): container finished" podID="8390c75c-afa2-498e-baea-ebf42594094f" containerID="5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c" exitCode=0 Oct 14 13:47:51 crc kubenswrapper[4837]: I1014 13:47:51.734892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g58t9" event={"ID":"8390c75c-afa2-498e-baea-ebf42594094f","Type":"ContainerDied","Data":"5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c"} Oct 14 13:47:51 crc kubenswrapper[4837]: I1014 13:47:51.737013 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:47:53 crc kubenswrapper[4837]: I1014 13:47:53.762140 4837 generic.go:334] "Generic (PLEG): container finished" podID="8390c75c-afa2-498e-baea-ebf42594094f" containerID="5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548" exitCode=0 Oct 14 13:47:53 crc kubenswrapper[4837]: I1014 
13:47:53.762617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g58t9" event={"ID":"8390c75c-afa2-498e-baea-ebf42594094f","Type":"ContainerDied","Data":"5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548"} Oct 14 13:47:54 crc kubenswrapper[4837]: I1014 13:47:54.776565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g58t9" event={"ID":"8390c75c-afa2-498e-baea-ebf42594094f","Type":"ContainerStarted","Data":"4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5"} Oct 14 13:47:54 crc kubenswrapper[4837]: I1014 13:47:54.803661 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g58t9" podStartSLOduration=6.096503422 podStartE2EDuration="8.803639937s" podCreationTimestamp="2025-10-14 13:47:46 +0000 UTC" firstStartedPulling="2025-10-14 13:47:51.7365648 +0000 UTC m=+2809.653564633" lastFinishedPulling="2025-10-14 13:47:54.443701325 +0000 UTC m=+2812.360701148" observedRunningTime="2025-10-14 13:47:54.795927807 +0000 UTC m=+2812.712927620" watchObservedRunningTime="2025-10-14 13:47:54.803639937 +0000 UTC m=+2812.720639750" Oct 14 13:47:57 crc kubenswrapper[4837]: I1014 13:47:57.051524 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:57 crc kubenswrapper[4837]: I1014 13:47:57.051980 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:47:57 crc kubenswrapper[4837]: I1014 13:47:57.130361 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:48:03 crc kubenswrapper[4837]: I1014 13:48:03.871685 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"beac4f98-00d6-438b-86cc-2f85d2ca1f96","Type":"ContainerStarted","Data":"94d1f8c63ceef21548fb42d20c1501934d8535273189c35a598ba5fe1716b8da"} Oct 14 13:48:03 crc kubenswrapper[4837]: I1014 13:48:03.903815 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.638219239 podStartE2EDuration="45.903780889s" podCreationTimestamp="2025-10-14 13:47:18 +0000 UTC" firstStartedPulling="2025-10-14 13:47:20.218215101 +0000 UTC m=+2778.135214914" lastFinishedPulling="2025-10-14 13:48:02.483776721 +0000 UTC m=+2820.400776564" observedRunningTime="2025-10-14 13:48:03.895369691 +0000 UTC m=+2821.812369534" watchObservedRunningTime="2025-10-14 13:48:03.903780889 +0000 UTC m=+2821.820780742" Oct 14 13:48:07 crc kubenswrapper[4837]: I1014 13:48:07.116362 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:48:07 crc kubenswrapper[4837]: I1014 13:48:07.171992 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g58t9"] Oct 14 13:48:07 crc kubenswrapper[4837]: I1014 13:48:07.911770 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g58t9" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="registry-server" containerID="cri-o://4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5" gracePeriod=2 Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.360673 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.518813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-catalog-content\") pod \"8390c75c-afa2-498e-baea-ebf42594094f\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.518892 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4whfn\" (UniqueName: \"kubernetes.io/projected/8390c75c-afa2-498e-baea-ebf42594094f-kube-api-access-4whfn\") pod \"8390c75c-afa2-498e-baea-ebf42594094f\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.519009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-utilities\") pod \"8390c75c-afa2-498e-baea-ebf42594094f\" (UID: \"8390c75c-afa2-498e-baea-ebf42594094f\") " Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.520513 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-utilities" (OuterVolumeSpecName: "utilities") pod "8390c75c-afa2-498e-baea-ebf42594094f" (UID: "8390c75c-afa2-498e-baea-ebf42594094f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.525844 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8390c75c-afa2-498e-baea-ebf42594094f-kube-api-access-4whfn" (OuterVolumeSpecName: "kube-api-access-4whfn") pod "8390c75c-afa2-498e-baea-ebf42594094f" (UID: "8390c75c-afa2-498e-baea-ebf42594094f"). InnerVolumeSpecName "kube-api-access-4whfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.573186 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8390c75c-afa2-498e-baea-ebf42594094f" (UID: "8390c75c-afa2-498e-baea-ebf42594094f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.621440 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.621474 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8390c75c-afa2-498e-baea-ebf42594094f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.621487 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4whfn\" (UniqueName: \"kubernetes.io/projected/8390c75c-afa2-498e-baea-ebf42594094f-kube-api-access-4whfn\") on node \"crc\" DevicePath \"\"" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.922974 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g58t9" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.922988 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g58t9" event={"ID":"8390c75c-afa2-498e-baea-ebf42594094f","Type":"ContainerDied","Data":"4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5"} Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.923037 4837 scope.go:117] "RemoveContainer" containerID="4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.922842 4837 generic.go:334] "Generic (PLEG): container finished" podID="8390c75c-afa2-498e-baea-ebf42594094f" containerID="4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5" exitCode=0 Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.923526 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g58t9" event={"ID":"8390c75c-afa2-498e-baea-ebf42594094f","Type":"ContainerDied","Data":"769eb373e7bebf5866f2c3d9b2679f4cade8c00e7c0168304da8226ccdb61f34"} Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.954660 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g58t9"] Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.957410 4837 scope.go:117] "RemoveContainer" containerID="5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548" Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.963392 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g58t9"] Oct 14 13:48:08 crc kubenswrapper[4837]: I1014 13:48:08.987135 4837 scope.go:117] "RemoveContainer" containerID="5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c" Oct 14 13:48:09 crc kubenswrapper[4837]: I1014 13:48:09.048499 4837 scope.go:117] "RemoveContainer" 
containerID="4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5" Oct 14 13:48:09 crc kubenswrapper[4837]: E1014 13:48:09.049000 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5\": container with ID starting with 4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5 not found: ID does not exist" containerID="4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5" Oct 14 13:48:09 crc kubenswrapper[4837]: I1014 13:48:09.049038 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5"} err="failed to get container status \"4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5\": rpc error: code = NotFound desc = could not find container \"4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5\": container with ID starting with 4ce6e4b902d576cccdfb19b7e069ec0053282967435d60246b126acd3b9a19e5 not found: ID does not exist" Oct 14 13:48:09 crc kubenswrapper[4837]: I1014 13:48:09.049064 4837 scope.go:117] "RemoveContainer" containerID="5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548" Oct 14 13:48:09 crc kubenswrapper[4837]: E1014 13:48:09.049429 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548\": container with ID starting with 5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548 not found: ID does not exist" containerID="5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548" Oct 14 13:48:09 crc kubenswrapper[4837]: I1014 13:48:09.049456 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548"} err="failed to get container status \"5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548\": rpc error: code = NotFound desc = could not find container \"5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548\": container with ID starting with 5c401575840e390edefbcb4fa495318a961f32cdf68d268333e5ba487b9de548 not found: ID does not exist" Oct 14 13:48:09 crc kubenswrapper[4837]: I1014 13:48:09.049474 4837 scope.go:117] "RemoveContainer" containerID="5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c" Oct 14 13:48:09 crc kubenswrapper[4837]: E1014 13:48:09.049868 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c\": container with ID starting with 5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c not found: ID does not exist" containerID="5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c" Oct 14 13:48:09 crc kubenswrapper[4837]: I1014 13:48:09.049967 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c"} err="failed to get container status \"5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c\": rpc error: code = NotFound desc = could not find container \"5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c\": container with ID starting with 5999f4539b86342886c9098d19517f3794d015e0286881d1ff275b9fe6aca49c not found: ID does not exist" Oct 14 13:48:10 crc kubenswrapper[4837]: I1014 13:48:10.824285 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8390c75c-afa2-498e-baea-ebf42594094f" path="/var/lib/kubelet/pods/8390c75c-afa2-498e-baea-ebf42594094f/volumes" Oct 14 13:48:11 crc kubenswrapper[4837]: I1014 
13:48:11.140300 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:48:11 crc kubenswrapper[4837]: I1014 13:48:11.140399 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:48:41 crc kubenswrapper[4837]: I1014 13:48:41.140825 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:48:41 crc kubenswrapper[4837]: I1014 13:48:41.142452 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.140119 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.140688 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.140739 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.141644 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.141697 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" gracePeriod=600 Oct 14 13:49:11 crc kubenswrapper[4837]: E1014 13:49:11.264013 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.559673 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" exitCode=0 Oct 14 
13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.559724 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de"} Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.559761 4837 scope.go:117] "RemoveContainer" containerID="16f108fe71a51812ebae976a48c956d270ab585893aa3a25beab8e26a9f907af" Oct 14 13:49:11 crc kubenswrapper[4837]: I1014 13:49:11.560358 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:49:11 crc kubenswrapper[4837]: E1014 13:49:11.560601 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:49:26 crc kubenswrapper[4837]: I1014 13:49:26.784812 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:49:26 crc kubenswrapper[4837]: E1014 13:49:26.786599 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:49:38 crc kubenswrapper[4837]: I1014 13:49:38.784443 4837 scope.go:117] "RemoveContainer" 
containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:49:38 crc kubenswrapper[4837]: E1014 13:49:38.786182 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:49:52 crc kubenswrapper[4837]: I1014 13:49:52.798438 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:49:52 crc kubenswrapper[4837]: E1014 13:49:52.799550 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:50:03 crc kubenswrapper[4837]: I1014 13:50:03.784058 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:50:03 crc kubenswrapper[4837]: E1014 13:50:03.784807 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:50:15 crc kubenswrapper[4837]: I1014 13:50:15.784067 4837 scope.go:117] 
"RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:50:15 crc kubenswrapper[4837]: E1014 13:50:15.784719 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:50:26 crc kubenswrapper[4837]: I1014 13:50:26.784978 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:50:26 crc kubenswrapper[4837]: E1014 13:50:26.785794 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:50:39 crc kubenswrapper[4837]: I1014 13:50:39.785315 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:50:39 crc kubenswrapper[4837]: E1014 13:50:39.786319 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:50:54 crc kubenswrapper[4837]: I1014 13:50:54.784421 
4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:50:54 crc kubenswrapper[4837]: E1014 13:50:54.785153 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:51:07 crc kubenswrapper[4837]: I1014 13:51:07.784351 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:51:07 crc kubenswrapper[4837]: E1014 13:51:07.785152 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:51:20 crc kubenswrapper[4837]: I1014 13:51:20.784541 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:51:20 crc kubenswrapper[4837]: E1014 13:51:20.785388 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:51:34 crc kubenswrapper[4837]: I1014 
13:51:34.784908 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:51:34 crc kubenswrapper[4837]: E1014 13:51:34.786061 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:51:49 crc kubenswrapper[4837]: I1014 13:51:49.785127 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:51:49 crc kubenswrapper[4837]: E1014 13:51:49.786495 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:52:00 crc kubenswrapper[4837]: I1014 13:52:00.786724 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:52:00 crc kubenswrapper[4837]: E1014 13:52:00.787580 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:52:11 crc 
kubenswrapper[4837]: I1014 13:52:11.784419 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:52:11 crc kubenswrapper[4837]: E1014 13:52:11.785379 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:52:22 crc kubenswrapper[4837]: I1014 13:52:22.809195 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:52:22 crc kubenswrapper[4837]: E1014 13:52:22.810593 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:52:36 crc kubenswrapper[4837]: I1014 13:52:36.785281 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:52:36 crc kubenswrapper[4837]: E1014 13:52:36.786523 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 
14 13:52:51 crc kubenswrapper[4837]: I1014 13:52:51.785253 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:52:51 crc kubenswrapper[4837]: E1014 13:52:51.786013 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:53:04 crc kubenswrapper[4837]: I1014 13:53:04.787805 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:53:04 crc kubenswrapper[4837]: E1014 13:53:04.788728 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:53:15 crc kubenswrapper[4837]: I1014 13:53:15.785278 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:53:15 crc kubenswrapper[4837]: E1014 13:53:15.787049 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" 
podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.931808 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-msgg7"] Oct 14 13:53:22 crc kubenswrapper[4837]: E1014 13:53:22.932499 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="registry-server" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.932516 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="registry-server" Oct 14 13:53:22 crc kubenswrapper[4837]: E1014 13:53:22.932533 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="extract-utilities" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.932540 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="extract-utilities" Oct 14 13:53:22 crc kubenswrapper[4837]: E1014 13:53:22.932563 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="extract-content" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.932571 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="extract-content" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.932767 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8390c75c-afa2-498e-baea-ebf42594094f" containerName="registry-server" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.934322 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.944396 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-msgg7"] Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.962214 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhlx\" (UniqueName: \"kubernetes.io/projected/94d4ccef-7edb-4e1d-b45e-6897f694e691-kube-api-access-flhlx\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.962255 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-utilities\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:22 crc kubenswrapper[4837]: I1014 13:53:22.962322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-catalog-content\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.063973 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-catalog-content\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.064407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-flhlx\" (UniqueName: \"kubernetes.io/projected/94d4ccef-7edb-4e1d-b45e-6897f694e691-kube-api-access-flhlx\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.064436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-utilities\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.064488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-catalog-content\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.064793 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-utilities\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.087320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhlx\" (UniqueName: \"kubernetes.io/projected/94d4ccef-7edb-4e1d-b45e-6897f694e691-kube-api-access-flhlx\") pod \"redhat-operators-msgg7\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.295116 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.782707 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-msgg7"] Oct 14 13:53:23 crc kubenswrapper[4837]: I1014 13:53:23.882980 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerStarted","Data":"38e3c8998755abc53e7dc85ebe102c49e5cba69ab814eccbcf6402b1166d205b"} Oct 14 13:53:24 crc kubenswrapper[4837]: I1014 13:53:24.894932 4837 generic.go:334] "Generic (PLEG): container finished" podID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerID="22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb" exitCode=0 Oct 14 13:53:24 crc kubenswrapper[4837]: I1014 13:53:24.895037 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerDied","Data":"22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb"} Oct 14 13:53:24 crc kubenswrapper[4837]: I1014 13:53:24.898548 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:53:26 crc kubenswrapper[4837]: I1014 13:53:26.917846 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerStarted","Data":"7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2"} Oct 14 13:53:27 crc kubenswrapper[4837]: I1014 13:53:27.934486 4837 generic.go:334] "Generic (PLEG): container finished" podID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerID="7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2" exitCode=0 Oct 14 13:53:27 crc kubenswrapper[4837]: I1014 13:53:27.934565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerDied","Data":"7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2"} Oct 14 13:53:28 crc kubenswrapper[4837]: I1014 13:53:28.945876 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerStarted","Data":"684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a"} Oct 14 13:53:28 crc kubenswrapper[4837]: I1014 13:53:28.969703 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-msgg7" podStartSLOduration=3.418318812 podStartE2EDuration="6.969688675s" podCreationTimestamp="2025-10-14 13:53:22 +0000 UTC" firstStartedPulling="2025-10-14 13:53:24.898283621 +0000 UTC m=+3142.815283444" lastFinishedPulling="2025-10-14 13:53:28.449653494 +0000 UTC m=+3146.366653307" observedRunningTime="2025-10-14 13:53:28.967752952 +0000 UTC m=+3146.884752785" watchObservedRunningTime="2025-10-14 13:53:28.969688675 +0000 UTC m=+3146.886688478" Oct 14 13:53:30 crc kubenswrapper[4837]: I1014 13:53:30.785488 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:53:30 crc kubenswrapper[4837]: E1014 13:53:30.786050 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:53:33 crc kubenswrapper[4837]: I1014 13:53:33.295897 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:33 crc kubenswrapper[4837]: I1014 13:53:33.296246 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:34 crc kubenswrapper[4837]: I1014 13:53:34.337773 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-msgg7" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="registry-server" probeResult="failure" output=< Oct 14 13:53:34 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Oct 14 13:53:34 crc kubenswrapper[4837]: > Oct 14 13:53:43 crc kubenswrapper[4837]: I1014 13:53:43.369246 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:43 crc kubenswrapper[4837]: I1014 13:53:43.417943 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:43 crc kubenswrapper[4837]: I1014 13:53:43.608031 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-msgg7"] Oct 14 13:53:43 crc kubenswrapper[4837]: I1014 13:53:43.784550 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:53:43 crc kubenswrapper[4837]: E1014 13:53:43.784879 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.080319 4837 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-msgg7" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="registry-server" containerID="cri-o://684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a" gracePeriod=2 Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.593608 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.704623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-utilities\") pod \"94d4ccef-7edb-4e1d-b45e-6897f694e691\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.704676 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhlx\" (UniqueName: \"kubernetes.io/projected/94d4ccef-7edb-4e1d-b45e-6897f694e691-kube-api-access-flhlx\") pod \"94d4ccef-7edb-4e1d-b45e-6897f694e691\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.704857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-catalog-content\") pod \"94d4ccef-7edb-4e1d-b45e-6897f694e691\" (UID: \"94d4ccef-7edb-4e1d-b45e-6897f694e691\") " Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.705989 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-utilities" (OuterVolumeSpecName: "utilities") pod "94d4ccef-7edb-4e1d-b45e-6897f694e691" (UID: "94d4ccef-7edb-4e1d-b45e-6897f694e691"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.710995 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d4ccef-7edb-4e1d-b45e-6897f694e691-kube-api-access-flhlx" (OuterVolumeSpecName: "kube-api-access-flhlx") pod "94d4ccef-7edb-4e1d-b45e-6897f694e691" (UID: "94d4ccef-7edb-4e1d-b45e-6897f694e691"). InnerVolumeSpecName "kube-api-access-flhlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.807451 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.807479 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhlx\" (UniqueName: \"kubernetes.io/projected/94d4ccef-7edb-4e1d-b45e-6897f694e691-kube-api-access-flhlx\") on node \"crc\" DevicePath \"\"" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.832336 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94d4ccef-7edb-4e1d-b45e-6897f694e691" (UID: "94d4ccef-7edb-4e1d-b45e-6897f694e691"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:53:45 crc kubenswrapper[4837]: I1014 13:53:45.909805 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d4ccef-7edb-4e1d-b45e-6897f694e691-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.092061 4837 generic.go:334] "Generic (PLEG): container finished" podID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerID="684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a" exitCode=0 Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.092107 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerDied","Data":"684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a"} Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.092137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msgg7" event={"ID":"94d4ccef-7edb-4e1d-b45e-6897f694e691","Type":"ContainerDied","Data":"38e3c8998755abc53e7dc85ebe102c49e5cba69ab814eccbcf6402b1166d205b"} Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.092132 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-msgg7" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.092172 4837 scope.go:117] "RemoveContainer" containerID="684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.125476 4837 scope.go:117] "RemoveContainer" containerID="7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.125796 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-msgg7"] Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.138996 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-msgg7"] Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.146425 4837 scope.go:117] "RemoveContainer" containerID="22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.183550 4837 scope.go:117] "RemoveContainer" containerID="684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a" Oct 14 13:53:46 crc kubenswrapper[4837]: E1014 13:53:46.184068 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a\": container with ID starting with 684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a not found: ID does not exist" containerID="684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.184143 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a"} err="failed to get container status \"684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a\": rpc error: code = NotFound desc = could not find container 
\"684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a\": container with ID starting with 684c7e0581128257141e0ab21861e4fa0b0b5ee3be1584b6bce482235bb9612a not found: ID does not exist" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.184197 4837 scope.go:117] "RemoveContainer" containerID="7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2" Oct 14 13:53:46 crc kubenswrapper[4837]: E1014 13:53:46.184578 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2\": container with ID starting with 7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2 not found: ID does not exist" containerID="7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.184617 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2"} err="failed to get container status \"7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2\": rpc error: code = NotFound desc = could not find container \"7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2\": container with ID starting with 7f8a34c64fb7c405255fe0825fbe2b7935b90c8eb7004fb2ba2eddabe1114fd2 not found: ID does not exist" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.184643 4837 scope.go:117] "RemoveContainer" containerID="22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb" Oct 14 13:53:46 crc kubenswrapper[4837]: E1014 13:53:46.184930 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb\": container with ID starting with 22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb not found: ID does not exist" 
containerID="22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.184969 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb"} err="failed to get container status \"22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb\": rpc error: code = NotFound desc = could not find container \"22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb\": container with ID starting with 22ea05998bd66f8b2dc193b7ad850597f54e5feb4b8b2afcd45c4d8cde5f7fdb not found: ID does not exist" Oct 14 13:53:46 crc kubenswrapper[4837]: I1014 13:53:46.816763 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" path="/var/lib/kubelet/pods/94d4ccef-7edb-4e1d-b45e-6897f694e691/volumes" Oct 14 13:53:58 crc kubenswrapper[4837]: I1014 13:53:58.784971 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:53:58 crc kubenswrapper[4837]: E1014 13:53:58.786035 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:54:01 crc kubenswrapper[4837]: I1014 13:54:01.137445 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-6f7c5db7df-7tsqg" podUID="a7d3bc97-ce39-472b-860e-79b620b726f1" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 14 13:54:09 crc kubenswrapper[4837]: I1014 13:54:09.785063 4837 scope.go:117] "RemoveContainer" 
containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:54:09 crc kubenswrapper[4837]: E1014 13:54:09.786246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 13:54:21 crc kubenswrapper[4837]: I1014 13:54:21.784242 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:54:22 crc kubenswrapper[4837]: I1014 13:54:22.443996 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"54035085edd5f91a9f13e042f6da48247351b7f4344991c2baca4736f653e46f"} Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.113364 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gbztf"] Oct 14 13:56:41 crc kubenswrapper[4837]: E1014 13:56:41.114675 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="registry-server" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.114694 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="registry-server" Oct 14 13:56:41 crc kubenswrapper[4837]: E1014 13:56:41.114712 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="extract-content" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.114721 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="extract-content" Oct 14 13:56:41 crc kubenswrapper[4837]: E1014 13:56:41.114791 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="extract-utilities" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.114801 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="extract-utilities" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.115078 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d4ccef-7edb-4e1d-b45e-6897f694e691" containerName="registry-server" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.116880 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.122483 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbztf"] Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.139539 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.139589 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.261194 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-catalog-content\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.261273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdtcx\" (UniqueName: \"kubernetes.io/projected/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-kube-api-access-wdtcx\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.261339 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-utilities\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.363374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdtcx\" (UniqueName: \"kubernetes.io/projected/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-kube-api-access-wdtcx\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.363496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-utilities\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.363586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-catalog-content\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.364060 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-catalog-content\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.364293 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-utilities\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.386184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdtcx\" (UniqueName: \"kubernetes.io/projected/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-kube-api-access-wdtcx\") pod \"redhat-marketplace-gbztf\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.443536 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:41 crc kubenswrapper[4837]: I1014 13:56:41.966961 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbztf"] Oct 14 13:56:42 crc kubenswrapper[4837]: I1014 13:56:42.910571 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerID="dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3" exitCode=0 Oct 14 13:56:42 crc kubenswrapper[4837]: I1014 13:56:42.910614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbztf" event={"ID":"fd43d193-4af8-47fe-bfe3-be05e5ab6c49","Type":"ContainerDied","Data":"dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3"} Oct 14 13:56:42 crc kubenswrapper[4837]: I1014 13:56:42.910639 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbztf" event={"ID":"fd43d193-4af8-47fe-bfe3-be05e5ab6c49","Type":"ContainerStarted","Data":"a75e3c5048b7a7255789395e111179a6f7debc3cc47b5e90c68a5133d5f51430"} Oct 14 13:56:44 crc kubenswrapper[4837]: I1014 13:56:44.933522 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerID="83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463" exitCode=0 Oct 14 13:56:44 crc kubenswrapper[4837]: I1014 13:56:44.933665 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbztf" event={"ID":"fd43d193-4af8-47fe-bfe3-be05e5ab6c49","Type":"ContainerDied","Data":"83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463"} Oct 14 13:56:45 crc kubenswrapper[4837]: I1014 13:56:45.946928 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbztf" 
event={"ID":"fd43d193-4af8-47fe-bfe3-be05e5ab6c49","Type":"ContainerStarted","Data":"d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047"} Oct 14 13:56:45 crc kubenswrapper[4837]: I1014 13:56:45.966764 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gbztf" podStartSLOduration=2.358035151 podStartE2EDuration="4.966744503s" podCreationTimestamp="2025-10-14 13:56:41 +0000 UTC" firstStartedPulling="2025-10-14 13:56:42.912845929 +0000 UTC m=+3340.829845752" lastFinishedPulling="2025-10-14 13:56:45.521555281 +0000 UTC m=+3343.438555104" observedRunningTime="2025-10-14 13:56:45.96186498 +0000 UTC m=+3343.878864803" watchObservedRunningTime="2025-10-14 13:56:45.966744503 +0000 UTC m=+3343.883744316" Oct 14 13:56:51 crc kubenswrapper[4837]: I1014 13:56:51.444065 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:51 crc kubenswrapper[4837]: I1014 13:56:51.444513 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:51 crc kubenswrapper[4837]: I1014 13:56:51.502669 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:52 crc kubenswrapper[4837]: I1014 13:56:52.085644 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:52 crc kubenswrapper[4837]: I1014 13:56:52.138833 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbztf"] Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.039501 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gbztf" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="registry-server" 
containerID="cri-o://d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047" gracePeriod=2 Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.584491 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.783227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-utilities\") pod \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.783385 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-catalog-content\") pod \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.783627 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdtcx\" (UniqueName: \"kubernetes.io/projected/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-kube-api-access-wdtcx\") pod \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\" (UID: \"fd43d193-4af8-47fe-bfe3-be05e5ab6c49\") " Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.784070 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-utilities" (OuterVolumeSpecName: "utilities") pod "fd43d193-4af8-47fe-bfe3-be05e5ab6c49" (UID: "fd43d193-4af8-47fe-bfe3-be05e5ab6c49"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.784598 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.790310 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-kube-api-access-wdtcx" (OuterVolumeSpecName: "kube-api-access-wdtcx") pod "fd43d193-4af8-47fe-bfe3-be05e5ab6c49" (UID: "fd43d193-4af8-47fe-bfe3-be05e5ab6c49"). InnerVolumeSpecName "kube-api-access-wdtcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.797707 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd43d193-4af8-47fe-bfe3-be05e5ab6c49" (UID: "fd43d193-4af8-47fe-bfe3-be05e5ab6c49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.886462 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:56:54 crc kubenswrapper[4837]: I1014 13:56:54.886508 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdtcx\" (UniqueName: \"kubernetes.io/projected/fd43d193-4af8-47fe-bfe3-be05e5ab6c49-kube-api-access-wdtcx\") on node \"crc\" DevicePath \"\"" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.051627 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerID="d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047" exitCode=0 Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.051672 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbztf" event={"ID":"fd43d193-4af8-47fe-bfe3-be05e5ab6c49","Type":"ContainerDied","Data":"d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047"} Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.051698 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbztf" event={"ID":"fd43d193-4af8-47fe-bfe3-be05e5ab6c49","Type":"ContainerDied","Data":"a75e3c5048b7a7255789395e111179a6f7debc3cc47b5e90c68a5133d5f51430"} Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.051714 4837 scope.go:117] "RemoveContainer" containerID="d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.051735 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbztf" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.076579 4837 scope.go:117] "RemoveContainer" containerID="83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.084667 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbztf"] Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.093603 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbztf"] Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.097093 4837 scope.go:117] "RemoveContainer" containerID="dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.168005 4837 scope.go:117] "RemoveContainer" containerID="d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047" Oct 14 13:56:55 crc kubenswrapper[4837]: E1014 13:56:55.168427 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047\": container with ID starting with d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047 not found: ID does not exist" containerID="d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.168466 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047"} err="failed to get container status \"d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047\": rpc error: code = NotFound desc = could not find container \"d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047\": container with ID starting with d4a138ee341a20fbbc7ab886cd1bd4a7429e9fcc39e415cba206364287b7d047 not found: 
ID does not exist" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.168495 4837 scope.go:117] "RemoveContainer" containerID="83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463" Oct 14 13:56:55 crc kubenswrapper[4837]: E1014 13:56:55.169018 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463\": container with ID starting with 83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463 not found: ID does not exist" containerID="83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.169058 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463"} err="failed to get container status \"83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463\": rpc error: code = NotFound desc = could not find container \"83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463\": container with ID starting with 83e2282218037e304b551eea3f4047f54a49bce90139d6bf41d17e6510b15463 not found: ID does not exist" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.169086 4837 scope.go:117] "RemoveContainer" containerID="dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3" Oct 14 13:56:55 crc kubenswrapper[4837]: E1014 13:56:55.169414 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3\": container with ID starting with dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3 not found: ID does not exist" containerID="dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3" Oct 14 13:56:55 crc kubenswrapper[4837]: I1014 13:56:55.169455 4837 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3"} err="failed to get container status \"dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3\": rpc error: code = NotFound desc = could not find container \"dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3\": container with ID starting with dfaf2a2aaeb97eb58fae6a99e57601e083b6088267728dfa89ef0b76c37e4fe3 not found: ID does not exist" Oct 14 13:56:56 crc kubenswrapper[4837]: I1014 13:56:56.800349 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" path="/var/lib/kubelet/pods/fd43d193-4af8-47fe-bfe3-be05e5ab6c49/volumes" Oct 14 13:57:11 crc kubenswrapper[4837]: I1014 13:57:11.140641 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:57:11 crc kubenswrapper[4837]: I1014 13:57:11.141318 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.140462 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.141029 4837 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.141086 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.141995 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54035085edd5f91a9f13e042f6da48247351b7f4344991c2baca4736f653e46f"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.142076 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://54035085edd5f91a9f13e042f6da48247351b7f4344991c2baca4736f653e46f" gracePeriod=600 Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.482609 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="54035085edd5f91a9f13e042f6da48247351b7f4344991c2baca4736f653e46f" exitCode=0 Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.482690 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"54035085edd5f91a9f13e042f6da48247351b7f4344991c2baca4736f653e46f"} Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.482960 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb"} Oct 14 13:57:41 crc kubenswrapper[4837]: I1014 13:57:41.482981 4837 scope.go:117] "RemoveContainer" containerID="a630866f2be09767c258fcbf33ba266ad49dc2aacbc1eee20ba6f27a5d9811de" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.777295 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tv72q"] Oct 14 13:58:00 crc kubenswrapper[4837]: E1014 13:58:00.778059 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="extract-content" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.778073 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="extract-content" Oct 14 13:58:00 crc kubenswrapper[4837]: E1014 13:58:00.778085 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="extract-utilities" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.778091 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="extract-utilities" Oct 14 13:58:00 crc kubenswrapper[4837]: E1014 13:58:00.778130 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="registry-server" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.778142 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="registry-server" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.778392 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd43d193-4af8-47fe-bfe3-be05e5ab6c49" containerName="registry-server" Oct 14 
13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.779658 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.806612 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tv72q"] Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.875750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-utilities\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.875881 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krj68\" (UniqueName: \"kubernetes.io/projected/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-kube-api-access-krj68\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.875967 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-catalog-content\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.978109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-utilities\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 
13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.978268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krj68\" (UniqueName: \"kubernetes.io/projected/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-kube-api-access-krj68\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.978365 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-catalog-content\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.978852 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-catalog-content\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.978851 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-utilities\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:00 crc kubenswrapper[4837]: I1014 13:58:00.996766 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krj68\" (UniqueName: \"kubernetes.io/projected/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-kube-api-access-krj68\") pod \"community-operators-tv72q\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:01 crc 
kubenswrapper[4837]: I1014 13:58:01.151581 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:01 crc kubenswrapper[4837]: I1014 13:58:01.678469 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tv72q"] Oct 14 13:58:02 crc kubenswrapper[4837]: I1014 13:58:02.684527 4837 generic.go:334] "Generic (PLEG): container finished" podID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerID="502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0" exitCode=0 Oct 14 13:58:02 crc kubenswrapper[4837]: I1014 13:58:02.684624 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv72q" event={"ID":"9828f6ed-fa9c-457a-9f05-eb0b33f886c2","Type":"ContainerDied","Data":"502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0"} Oct 14 13:58:02 crc kubenswrapper[4837]: I1014 13:58:02.684918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv72q" event={"ID":"9828f6ed-fa9c-457a-9f05-eb0b33f886c2","Type":"ContainerStarted","Data":"d43e7d05ae828470aa187a5c1a5ceb92bebcfa56c26e5fc42d121cc419db5bf7"} Oct 14 13:58:04 crc kubenswrapper[4837]: I1014 13:58:04.710897 4837 generic.go:334] "Generic (PLEG): container finished" podID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerID="e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f" exitCode=0 Oct 14 13:58:04 crc kubenswrapper[4837]: I1014 13:58:04.711016 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv72q" event={"ID":"9828f6ed-fa9c-457a-9f05-eb0b33f886c2","Type":"ContainerDied","Data":"e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f"} Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.575652 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjlcc"] Oct 
14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.577599 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.607979 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjlcc"] Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.676578 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-utilities\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.676647 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pf47\" (UniqueName: \"kubernetes.io/projected/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-kube-api-access-5pf47\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.676700 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-catalog-content\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.778957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-utilities\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 
14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.779027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pf47\" (UniqueName: \"kubernetes.io/projected/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-kube-api-access-5pf47\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.779104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-catalog-content\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.779738 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-utilities\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.779964 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-catalog-content\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc kubenswrapper[4837]: I1014 13:58:05.812084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pf47\" (UniqueName: \"kubernetes.io/projected/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-kube-api-access-5pf47\") pod \"certified-operators-qjlcc\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:05 crc 
kubenswrapper[4837]: I1014 13:58:05.914731 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:06 crc kubenswrapper[4837]: I1014 13:58:06.453564 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjlcc"] Oct 14 13:58:06 crc kubenswrapper[4837]: I1014 13:58:06.739104 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv72q" event={"ID":"9828f6ed-fa9c-457a-9f05-eb0b33f886c2","Type":"ContainerStarted","Data":"2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52"} Oct 14 13:58:06 crc kubenswrapper[4837]: I1014 13:58:06.741209 4837 generic.go:334] "Generic (PLEG): container finished" podID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerID="6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce" exitCode=0 Oct 14 13:58:06 crc kubenswrapper[4837]: I1014 13:58:06.741266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjlcc" event={"ID":"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0","Type":"ContainerDied","Data":"6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce"} Oct 14 13:58:06 crc kubenswrapper[4837]: I1014 13:58:06.741299 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjlcc" event={"ID":"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0","Type":"ContainerStarted","Data":"9d38a3ac9a3b9bbf811c29304454d7bfa3d0b50a4d51717a1dba68efb8ac2bc5"} Oct 14 13:58:06 crc kubenswrapper[4837]: I1014 13:58:06.764435 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tv72q" podStartSLOduration=3.231065684 podStartE2EDuration="6.764416668s" podCreationTimestamp="2025-10-14 13:58:00 +0000 UTC" firstStartedPulling="2025-10-14 13:58:02.687515987 +0000 UTC m=+3420.604515840" lastFinishedPulling="2025-10-14 
13:58:06.220867011 +0000 UTC m=+3424.137866824" observedRunningTime="2025-10-14 13:58:06.760901913 +0000 UTC m=+3424.677901736" watchObservedRunningTime="2025-10-14 13:58:06.764416668 +0000 UTC m=+3424.681416471" Oct 14 13:58:08 crc kubenswrapper[4837]: I1014 13:58:08.776996 4837 generic.go:334] "Generic (PLEG): container finished" podID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerID="4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9" exitCode=0 Oct 14 13:58:08 crc kubenswrapper[4837]: I1014 13:58:08.777078 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjlcc" event={"ID":"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0","Type":"ContainerDied","Data":"4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9"} Oct 14 13:58:09 crc kubenswrapper[4837]: I1014 13:58:09.804236 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjlcc" event={"ID":"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0","Type":"ContainerStarted","Data":"6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20"} Oct 14 13:58:11 crc kubenswrapper[4837]: I1014 13:58:11.152479 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:11 crc kubenswrapper[4837]: I1014 13:58:11.153550 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:11 crc kubenswrapper[4837]: I1014 13:58:11.200937 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:11 crc kubenswrapper[4837]: I1014 13:58:11.226637 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjlcc" podStartSLOduration=3.582330294 podStartE2EDuration="6.22661815s" podCreationTimestamp="2025-10-14 13:58:05 +0000 UTC" 
firstStartedPulling="2025-10-14 13:58:06.743329147 +0000 UTC m=+3424.660328960" lastFinishedPulling="2025-10-14 13:58:09.387616993 +0000 UTC m=+3427.304616816" observedRunningTime="2025-10-14 13:58:09.825781715 +0000 UTC m=+3427.742781528" watchObservedRunningTime="2025-10-14 13:58:11.22661815 +0000 UTC m=+3429.143617973" Oct 14 13:58:11 crc kubenswrapper[4837]: I1014 13:58:11.879560 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:12 crc kubenswrapper[4837]: I1014 13:58:12.354437 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tv72q"] Oct 14 13:58:13 crc kubenswrapper[4837]: I1014 13:58:13.848991 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tv72q" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="registry-server" containerID="cri-o://2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52" gracePeriod=2 Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.363927 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.480006 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krj68\" (UniqueName: \"kubernetes.io/projected/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-kube-api-access-krj68\") pod \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.480081 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-catalog-content\") pod \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.480225 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-utilities\") pod \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\" (UID: \"9828f6ed-fa9c-457a-9f05-eb0b33f886c2\") " Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.482417 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-utilities" (OuterVolumeSpecName: "utilities") pod "9828f6ed-fa9c-457a-9f05-eb0b33f886c2" (UID: "9828f6ed-fa9c-457a-9f05-eb0b33f886c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.489676 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-kube-api-access-krj68" (OuterVolumeSpecName: "kube-api-access-krj68") pod "9828f6ed-fa9c-457a-9f05-eb0b33f886c2" (UID: "9828f6ed-fa9c-457a-9f05-eb0b33f886c2"). InnerVolumeSpecName "kube-api-access-krj68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.583050 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.583090 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krj68\" (UniqueName: \"kubernetes.io/projected/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-kube-api-access-krj68\") on node \"crc\" DevicePath \"\"" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.863760 4837 generic.go:334] "Generic (PLEG): container finished" podID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerID="2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52" exitCode=0 Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.863815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv72q" event={"ID":"9828f6ed-fa9c-457a-9f05-eb0b33f886c2","Type":"ContainerDied","Data":"2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52"} Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.863857 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tv72q" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.863861 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv72q" event={"ID":"9828f6ed-fa9c-457a-9f05-eb0b33f886c2","Type":"ContainerDied","Data":"d43e7d05ae828470aa187a5c1a5ceb92bebcfa56c26e5fc42d121cc419db5bf7"} Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.863877 4837 scope.go:117] "RemoveContainer" containerID="2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.885107 4837 scope.go:117] "RemoveContainer" containerID="e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.922438 4837 scope.go:117] "RemoveContainer" containerID="502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.954090 4837 scope.go:117] "RemoveContainer" containerID="2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52" Oct 14 13:58:14 crc kubenswrapper[4837]: E1014 13:58:14.954750 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52\": container with ID starting with 2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52 not found: ID does not exist" containerID="2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.954795 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52"} err="failed to get container status \"2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52\": rpc error: code = NotFound desc = could not find container 
\"2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52\": container with ID starting with 2f9287f2f74eba89b72c931ed8173f622aad75fa4f24acb885abd0e0df596f52 not found: ID does not exist" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.954821 4837 scope.go:117] "RemoveContainer" containerID="e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f" Oct 14 13:58:14 crc kubenswrapper[4837]: E1014 13:58:14.955248 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f\": container with ID starting with e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f not found: ID does not exist" containerID="e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.955321 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f"} err="failed to get container status \"e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f\": rpc error: code = NotFound desc = could not find container \"e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f\": container with ID starting with e9dd5b12fc95b2d02307892aec81a55e52f20bcf82e1e3a15042a561b875ce2f not found: ID does not exist" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.955344 4837 scope.go:117] "RemoveContainer" containerID="502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0" Oct 14 13:58:14 crc kubenswrapper[4837]: E1014 13:58:14.955717 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0\": container with ID starting with 502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0 not found: ID does not exist" 
containerID="502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.955750 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0"} err="failed to get container status \"502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0\": rpc error: code = NotFound desc = could not find container \"502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0\": container with ID starting with 502fc460e963a72c4efc3813dbe7445be5dc5f99dc767a74b6eda5a9fcebf8b0 not found: ID does not exist" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.972600 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9828f6ed-fa9c-457a-9f05-eb0b33f886c2" (UID: "9828f6ed-fa9c-457a-9f05-eb0b33f886c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:58:14 crc kubenswrapper[4837]: I1014 13:58:14.989977 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9828f6ed-fa9c-457a-9f05-eb0b33f886c2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:58:15 crc kubenswrapper[4837]: I1014 13:58:15.212055 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tv72q"] Oct 14 13:58:15 crc kubenswrapper[4837]: I1014 13:58:15.227221 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tv72q"] Oct 14 13:58:15 crc kubenswrapper[4837]: I1014 13:58:15.916292 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:15 crc kubenswrapper[4837]: I1014 13:58:15.916545 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:15 crc kubenswrapper[4837]: I1014 13:58:15.974049 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:16 crc kubenswrapper[4837]: I1014 13:58:16.802981 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" path="/var/lib/kubelet/pods/9828f6ed-fa9c-457a-9f05-eb0b33f886c2/volumes" Oct 14 13:58:16 crc kubenswrapper[4837]: I1014 13:58:16.953618 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:17 crc kubenswrapper[4837]: I1014 13:58:17.755409 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjlcc"] Oct 14 13:58:18 crc kubenswrapper[4837]: I1014 13:58:18.922862 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-qjlcc" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="registry-server" containerID="cri-o://6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20" gracePeriod=2 Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.479915 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.576735 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-utilities\") pod \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.576785 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pf47\" (UniqueName: \"kubernetes.io/projected/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-kube-api-access-5pf47\") pod \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.576928 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-catalog-content\") pod \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\" (UID: \"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0\") " Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.578384 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-utilities" (OuterVolumeSpecName: "utilities") pod "b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" (UID: "b7ebfae1-b564-47d4-84c3-38c31ff0f8a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.583347 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-kube-api-access-5pf47" (OuterVolumeSpecName: "kube-api-access-5pf47") pod "b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" (UID: "b7ebfae1-b564-47d4-84c3-38c31ff0f8a0"). InnerVolumeSpecName "kube-api-access-5pf47". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.622007 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" (UID: "b7ebfae1-b564-47d4-84c3-38c31ff0f8a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.679098 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.679142 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.679153 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pf47\" (UniqueName: \"kubernetes.io/projected/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0-kube-api-access-5pf47\") on node \"crc\" DevicePath \"\"" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.933456 4837 generic.go:334] "Generic (PLEG): container finished" podID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" 
containerID="6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20" exitCode=0 Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.933512 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjlcc" event={"ID":"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0","Type":"ContainerDied","Data":"6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20"} Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.933544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjlcc" event={"ID":"b7ebfae1-b564-47d4-84c3-38c31ff0f8a0","Type":"ContainerDied","Data":"9d38a3ac9a3b9bbf811c29304454d7bfa3d0b50a4d51717a1dba68efb8ac2bc5"} Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.933563 4837 scope.go:117] "RemoveContainer" containerID="6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.933716 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjlcc" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.968479 4837 scope.go:117] "RemoveContainer" containerID="4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9" Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.972640 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjlcc"] Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.979026 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjlcc"] Oct 14 13:58:19 crc kubenswrapper[4837]: I1014 13:58:19.992330 4837 scope.go:117] "RemoveContainer" containerID="6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.062403 4837 scope.go:117] "RemoveContainer" containerID="6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20" Oct 14 13:58:20 crc kubenswrapper[4837]: E1014 13:58:20.062916 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20\": container with ID starting with 6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20 not found: ID does not exist" containerID="6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.062955 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20"} err="failed to get container status \"6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20\": rpc error: code = NotFound desc = could not find container \"6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20\": container with ID starting with 6f068c66fb6dd96e4efb5e38208126917ab78e1dcf7a68afb75f4e02a8f5fd20 not 
found: ID does not exist" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.062983 4837 scope.go:117] "RemoveContainer" containerID="4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9" Oct 14 13:58:20 crc kubenswrapper[4837]: E1014 13:58:20.063809 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9\": container with ID starting with 4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9 not found: ID does not exist" containerID="4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.063842 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9"} err="failed to get container status \"4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9\": rpc error: code = NotFound desc = could not find container \"4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9\": container with ID starting with 4e2964871560c1ea40dbdcf25257276a3f0facebce513a604f1d86f7f3dc60a9 not found: ID does not exist" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.063863 4837 scope.go:117] "RemoveContainer" containerID="6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce" Oct 14 13:58:20 crc kubenswrapper[4837]: E1014 13:58:20.064306 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce\": container with ID starting with 6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce not found: ID does not exist" containerID="6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.064351 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce"} err="failed to get container status \"6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce\": rpc error: code = NotFound desc = could not find container \"6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce\": container with ID starting with 6ae61d2b524df37fab80296456172dcdc894d07b66bf513dbbcf9a12f882b5ce not found: ID does not exist" Oct 14 13:58:20 crc kubenswrapper[4837]: I1014 13:58:20.802250 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" path="/var/lib/kubelet/pods/b7ebfae1-b564-47d4-84c3-38c31ff0f8a0/volumes" Oct 14 13:59:10 crc kubenswrapper[4837]: I1014 13:59:10.401063 4837 generic.go:334] "Generic (PLEG): container finished" podID="beac4f98-00d6-438b-86cc-2f85d2ca1f96" containerID="94d1f8c63ceef21548fb42d20c1501934d8535273189c35a598ba5fe1716b8da" exitCode=0 Oct 14 13:59:10 crc kubenswrapper[4837]: I1014 13:59:10.401170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"beac4f98-00d6-438b-86cc-2f85d2ca1f96","Type":"ContainerDied","Data":"94d1f8c63ceef21548fb42d20c1501934d8535273189c35a598ba5fe1716b8da"} Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.830855 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.902845 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config-secret\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.902994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ssh-key\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ggtp\" (UniqueName: \"kubernetes.io/projected/beac4f98-00d6-438b-86cc-2f85d2ca1f96-kube-api-access-8ggtp\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903138 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ca-certs\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903218 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-workdir\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903290 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-config-data\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903391 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903493 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-temporary\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.903542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config\") pod \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\" (UID: \"beac4f98-00d6-438b-86cc-2f85d2ca1f96\") " Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.904943 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-config-data" (OuterVolumeSpecName: "config-data") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.905563 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.910143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.915486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beac4f98-00d6-438b-86cc-2f85d2ca1f96-kube-api-access-8ggtp" (OuterVolumeSpecName: "kube-api-access-8ggtp") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "kube-api-access-8ggtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.918464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.933693 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.947098 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.961375 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:59:11 crc kubenswrapper[4837]: I1014 13:59:11.987081 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "beac4f98-00d6-438b-86cc-2f85d2ca1f96" (UID: "beac4f98-00d6-438b-86cc-2f85d2ca1f96"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005050 4837 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005082 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005095 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005123 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005158 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/beac4f98-00d6-438b-86cc-2f85d2ca1f96-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005197 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005214 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 14 
13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005226 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beac4f98-00d6-438b-86cc-2f85d2ca1f96-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.005235 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ggtp\" (UniqueName: \"kubernetes.io/projected/beac4f98-00d6-438b-86cc-2f85d2ca1f96-kube-api-access-8ggtp\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.026750 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.107137 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.425055 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"beac4f98-00d6-438b-86cc-2f85d2ca1f96","Type":"ContainerDied","Data":"a14b21bea8a80ed7a7c683d09fdfaa2484e6f0f190cac8f5a02c2de43911e0be"} Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.425421 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a14b21bea8a80ed7a7c683d09fdfaa2484e6f0f190cac8f5a02c2de43911e0be" Oct 14 13:59:12 crc kubenswrapper[4837]: I1014 13:59:12.425319 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.124994 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.126698 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="registry-server" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.126775 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="registry-server" Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.126844 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="extract-utilities" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.126908 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="extract-utilities" Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.126982 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="extract-content" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127039 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="extract-content" Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.127105 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="extract-utilities" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127179 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="extract-utilities" Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.127248 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="registry-server" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127309 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="registry-server" Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.127369 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beac4f98-00d6-438b-86cc-2f85d2ca1f96" containerName="tempest-tests-tempest-tests-runner" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127425 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="beac4f98-00d6-438b-86cc-2f85d2ca1f96" containerName="tempest-tests-tempest-tests-runner" Oct 14 13:59:20 crc kubenswrapper[4837]: E1014 13:59:20.127488 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="extract-content" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127541 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="extract-content" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127794 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ebfae1-b564-47d4-84c3-38c31ff0f8a0" containerName="registry-server" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127880 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9828f6ed-fa9c-457a-9f05-eb0b33f886c2" containerName="registry-server" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.127940 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="beac4f98-00d6-438b-86cc-2f85d2ca1f96" containerName="tempest-tests-tempest-tests-runner" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.128685 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.132733 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gbfpm" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.143438 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.263194 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq8w9\" (UniqueName: \"kubernetes.io/projected/5726ac57-50cd-4985-a6cd-86a9683bb283-kube-api-access-nq8w9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.263471 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.365721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq8w9\" (UniqueName: \"kubernetes.io/projected/5726ac57-50cd-4985-a6cd-86a9683bb283-kube-api-access-nq8w9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.365764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.366400 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.384839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq8w9\" (UniqueName: \"kubernetes.io/projected/5726ac57-50cd-4985-a6cd-86a9683bb283-kube-api-access-nq8w9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.403124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5726ac57-50cd-4985-a6cd-86a9683bb283\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.490994 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.935221 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 14 13:59:20 crc kubenswrapper[4837]: I1014 13:59:20.939700 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:59:21 crc kubenswrapper[4837]: I1014 13:59:21.519068 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5726ac57-50cd-4985-a6cd-86a9683bb283","Type":"ContainerStarted","Data":"c51379d5b54924a601c20e528d9e1546e82bd2901dac2baf155e91bb497bd46b"} Oct 14 13:59:22 crc kubenswrapper[4837]: I1014 13:59:22.537267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5726ac57-50cd-4985-a6cd-86a9683bb283","Type":"ContainerStarted","Data":"31e513777c667294ce906193593c297a05d4caa7f2b78c5a42d5bf256d76d9ea"} Oct 14 13:59:22 crc kubenswrapper[4837]: I1014 13:59:22.566700 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.603816994 podStartE2EDuration="2.566670743s" podCreationTimestamp="2025-10-14 13:59:20 +0000 UTC" firstStartedPulling="2025-10-14 13:59:20.939440533 +0000 UTC m=+3498.856440356" lastFinishedPulling="2025-10-14 13:59:21.902294272 +0000 UTC m=+3499.819294105" observedRunningTime="2025-10-14 13:59:22.558952863 +0000 UTC m=+3500.475952736" watchObservedRunningTime="2025-10-14 13:59:22.566670743 +0000 UTC m=+3500.483670606" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.691438 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-96xbk/must-gather-hs27l"] Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 
13:59:39.693987 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.695606 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-96xbk"/"openshift-service-ca.crt" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.695836 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-96xbk"/"kube-root-ca.crt" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.696427 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-96xbk"/"default-dockercfg-j6hfp" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.701134 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-96xbk/must-gather-hs27l"] Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.747627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8w7\" (UniqueName: \"kubernetes.io/projected/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-kube-api-access-dz8w7\") pod \"must-gather-hs27l\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.748316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-must-gather-output\") pod \"must-gather-hs27l\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.850381 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8w7\" (UniqueName: \"kubernetes.io/projected/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-kube-api-access-dz8w7\") pod \"must-gather-hs27l\" 
(UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.850602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-must-gather-output\") pod \"must-gather-hs27l\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.851281 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-must-gather-output\") pod \"must-gather-hs27l\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:39 crc kubenswrapper[4837]: I1014 13:59:39.869099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz8w7\" (UniqueName: \"kubernetes.io/projected/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-kube-api-access-dz8w7\") pod \"must-gather-hs27l\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:40 crc kubenswrapper[4837]: I1014 13:59:40.024539 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 13:59:40 crc kubenswrapper[4837]: I1014 13:59:40.500810 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-96xbk/must-gather-hs27l"] Oct 14 13:59:40 crc kubenswrapper[4837]: I1014 13:59:40.738687 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/must-gather-hs27l" event={"ID":"19fc4828-1105-4f55-bfd4-9b8bc2b8403b","Type":"ContainerStarted","Data":"8351b9d1d3220aa14979198c2c9bd47f85bca7de6ed41414432bf7aaf0ee00a7"} Oct 14 13:59:41 crc kubenswrapper[4837]: I1014 13:59:41.139932 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 13:59:41 crc kubenswrapper[4837]: I1014 13:59:41.140021 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 13:59:44 crc kubenswrapper[4837]: I1014 13:59:44.779952 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/must-gather-hs27l" event={"ID":"19fc4828-1105-4f55-bfd4-9b8bc2b8403b","Type":"ContainerStarted","Data":"0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1"} Oct 14 13:59:44 crc kubenswrapper[4837]: I1014 13:59:44.780612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/must-gather-hs27l" event={"ID":"19fc4828-1105-4f55-bfd4-9b8bc2b8403b","Type":"ContainerStarted","Data":"983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2"} Oct 14 13:59:44 crc 
kubenswrapper[4837]: I1014 13:59:44.805093 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-96xbk/must-gather-hs27l" podStartSLOduration=2.207500778 podStartE2EDuration="5.805073413s" podCreationTimestamp="2025-10-14 13:59:39 +0000 UTC" firstStartedPulling="2025-10-14 13:59:40.513209887 +0000 UTC m=+3518.430209700" lastFinishedPulling="2025-10-14 13:59:44.110782522 +0000 UTC m=+3522.027782335" observedRunningTime="2025-10-14 13:59:44.803887322 +0000 UTC m=+3522.720887145" watchObservedRunningTime="2025-10-14 13:59:44.805073413 +0000 UTC m=+3522.722073226" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.026686 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-96xbk/crc-debug-pxpm9"] Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.028550 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.107379 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04d915b-b7fb-4d05-b341-acc770eb1f92-host\") pod \"crc-debug-pxpm9\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.107598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fksd8\" (UniqueName: \"kubernetes.io/projected/d04d915b-b7fb-4d05-b341-acc770eb1f92-kube-api-access-fksd8\") pod \"crc-debug-pxpm9\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.209121 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fksd8\" (UniqueName: 
\"kubernetes.io/projected/d04d915b-b7fb-4d05-b341-acc770eb1f92-kube-api-access-fksd8\") pod \"crc-debug-pxpm9\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.209285 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04d915b-b7fb-4d05-b341-acc770eb1f92-host\") pod \"crc-debug-pxpm9\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.209408 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04d915b-b7fb-4d05-b341-acc770eb1f92-host\") pod \"crc-debug-pxpm9\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.230954 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fksd8\" (UniqueName: \"kubernetes.io/projected/d04d915b-b7fb-4d05-b341-acc770eb1f92-kube-api-access-fksd8\") pod \"crc-debug-pxpm9\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.351921 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 13:59:48 crc kubenswrapper[4837]: W1014 13:59:48.394720 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd04d915b_b7fb_4d05_b341_acc770eb1f92.slice/crio-b92829689338c0c0ba4b13cdb29ed03c67e7de0fe3a77b951703b479eae39019 WatchSource:0}: Error finding container b92829689338c0c0ba4b13cdb29ed03c67e7de0fe3a77b951703b479eae39019: Status 404 returned error can't find the container with id b92829689338c0c0ba4b13cdb29ed03c67e7de0fe3a77b951703b479eae39019 Oct 14 13:59:48 crc kubenswrapper[4837]: I1014 13:59:48.834383 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" event={"ID":"d04d915b-b7fb-4d05-b341-acc770eb1f92","Type":"ContainerStarted","Data":"b92829689338c0c0ba4b13cdb29ed03c67e7de0fe3a77b951703b479eae39019"} Oct 14 13:59:58 crc kubenswrapper[4837]: I1014 13:59:58.930288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" event={"ID":"d04d915b-b7fb-4d05-b341-acc770eb1f92","Type":"ContainerStarted","Data":"eac1fb23e1f32330e4c73bbb3d3e3fd64011b059a64d3136f58670dbd2a19e3c"} Oct 14 13:59:58 crc kubenswrapper[4837]: I1014 13:59:58.950406 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" podStartSLOduration=1.174234474 podStartE2EDuration="10.950385717s" podCreationTimestamp="2025-10-14 13:59:48 +0000 UTC" firstStartedPulling="2025-10-14 13:59:48.397951641 +0000 UTC m=+3526.314951454" lastFinishedPulling="2025-10-14 13:59:58.174102874 +0000 UTC m=+3536.091102697" observedRunningTime="2025-10-14 13:59:58.942077833 +0000 UTC m=+3536.859077646" watchObservedRunningTime="2025-10-14 13:59:58.950385717 +0000 UTC m=+3536.867385530" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.177183 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5"] Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.178562 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.180553 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.197695 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.197867 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5"] Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.360351 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4bp\" (UniqueName: \"kubernetes.io/projected/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-kube-api-access-4m4bp\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.360517 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-config-volume\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.360753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-secret-volume\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.462510 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4bp\" (UniqueName: \"kubernetes.io/projected/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-kube-api-access-4m4bp\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.462921 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-config-volume\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.463009 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-secret-volume\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.463737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-config-volume\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.479784 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-secret-volume\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.488033 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4bp\" (UniqueName: \"kubernetes.io/projected/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-kube-api-access-4m4bp\") pod \"collect-profiles-29340840-pskw5\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.503746 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:00 crc kubenswrapper[4837]: I1014 14:00:00.974200 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5"] Oct 14 14:00:00 crc kubenswrapper[4837]: W1014 14:00:00.982960 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca5bb57_a0c3_43c4_8458_ce0607d6df2d.slice/crio-a491ca6549c45aed20fa052b9f137673cd0645eee02e1636fad83632192cb0b5 WatchSource:0}: Error finding container a491ca6549c45aed20fa052b9f137673cd0645eee02e1636fad83632192cb0b5: Status 404 returned error can't find the container with id a491ca6549c45aed20fa052b9f137673cd0645eee02e1636fad83632192cb0b5 Oct 14 14:00:01 crc kubenswrapper[4837]: I1014 14:00:01.959475 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" 
event={"ID":"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d","Type":"ContainerStarted","Data":"e8d623cb5a45effad4c3d215928d235e7b4a995c050a0162bfdf5f9f146a67df"} Oct 14 14:00:01 crc kubenswrapper[4837]: I1014 14:00:01.959782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" event={"ID":"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d","Type":"ContainerStarted","Data":"a491ca6549c45aed20fa052b9f137673cd0645eee02e1636fad83632192cb0b5"} Oct 14 14:00:02 crc kubenswrapper[4837]: I1014 14:00:02.971108 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" containerID="e8d623cb5a45effad4c3d215928d235e7b4a995c050a0162bfdf5f9f146a67df" exitCode=0 Oct 14 14:00:02 crc kubenswrapper[4837]: I1014 14:00:02.971220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" event={"ID":"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d","Type":"ContainerDied","Data":"e8d623cb5a45effad4c3d215928d235e7b4a995c050a0162bfdf5f9f146a67df"} Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.348726 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.432349 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-config-volume\") pod \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.432506 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4bp\" (UniqueName: \"kubernetes.io/projected/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-kube-api-access-4m4bp\") pod \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.432618 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-secret-volume\") pod \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\" (UID: \"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d\") " Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.440768 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-config-volume" (OuterVolumeSpecName: "config-volume") pod "7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" (UID: "7ca5bb57-a0c3-43c4-8458-ce0607d6df2d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.447305 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-kube-api-access-4m4bp" (OuterVolumeSpecName: "kube-api-access-4m4bp") pod "7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" (UID: "7ca5bb57-a0c3-43c4-8458-ce0607d6df2d"). 
InnerVolumeSpecName "kube-api-access-4m4bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.449345 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" (UID: "7ca5bb57-a0c3-43c4-8458-ce0607d6df2d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.534457 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.534752 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m4bp\" (UniqueName: \"kubernetes.io/projected/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-kube-api-access-4m4bp\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.534764 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ca5bb57-a0c3-43c4-8458-ce0607d6df2d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.990003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" event={"ID":"7ca5bb57-a0c3-43c4-8458-ce0607d6df2d","Type":"ContainerDied","Data":"a491ca6549c45aed20fa052b9f137673cd0645eee02e1636fad83632192cb0b5"} Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.990050 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a491ca6549c45aed20fa052b9f137673cd0645eee02e1636fad83632192cb0b5" Oct 14 14:00:04 crc kubenswrapper[4837]: I1014 14:00:04.990115 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340840-pskw5" Oct 14 14:00:05 crc kubenswrapper[4837]: I1014 14:00:05.423590 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln"] Oct 14 14:00:05 crc kubenswrapper[4837]: I1014 14:00:05.435306 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-9cnln"] Oct 14 14:00:06 crc kubenswrapper[4837]: I1014 14:00:06.799317 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c1cd8b-17e5-454d-8925-f75cd0539c12" path="/var/lib/kubelet/pods/e5c1cd8b-17e5-454d-8925-f75cd0539c12/volumes" Oct 14 14:00:08 crc kubenswrapper[4837]: I1014 14:00:08.166934 4837 scope.go:117] "RemoveContainer" containerID="89b8d4715045f5249b4eb01bd2eedd2847326fe345ca530a88bf6de970df01cd" Oct 14 14:00:11 crc kubenswrapper[4837]: I1014 14:00:11.140368 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 14:00:11 crc kubenswrapper[4837]: I1014 14:00:11.140896 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 14:00:37 crc kubenswrapper[4837]: I1014 14:00:37.283117 4837 generic.go:334] "Generic (PLEG): container finished" podID="d04d915b-b7fb-4d05-b341-acc770eb1f92" containerID="eac1fb23e1f32330e4c73bbb3d3e3fd64011b059a64d3136f58670dbd2a19e3c" exitCode=0 Oct 14 14:00:37 crc kubenswrapper[4837]: I1014 
14:00:37.283237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" event={"ID":"d04d915b-b7fb-4d05-b341-acc770eb1f92","Type":"ContainerDied","Data":"eac1fb23e1f32330e4c73bbb3d3e3fd64011b059a64d3136f58670dbd2a19e3c"} Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.429636 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.462237 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-96xbk/crc-debug-pxpm9"] Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.471781 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-96xbk/crc-debug-pxpm9"] Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.482593 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04d915b-b7fb-4d05-b341-acc770eb1f92-host\") pod \"d04d915b-b7fb-4d05-b341-acc770eb1f92\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.482650 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fksd8\" (UniqueName: \"kubernetes.io/projected/d04d915b-b7fb-4d05-b341-acc770eb1f92-kube-api-access-fksd8\") pod \"d04d915b-b7fb-4d05-b341-acc770eb1f92\" (UID: \"d04d915b-b7fb-4d05-b341-acc770eb1f92\") " Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.482762 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04d915b-b7fb-4d05-b341-acc770eb1f92-host" (OuterVolumeSpecName: "host") pod "d04d915b-b7fb-4d05-b341-acc770eb1f92" (UID: "d04d915b-b7fb-4d05-b341-acc770eb1f92"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.483174 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d04d915b-b7fb-4d05-b341-acc770eb1f92-host\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.500362 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04d915b-b7fb-4d05-b341-acc770eb1f92-kube-api-access-fksd8" (OuterVolumeSpecName: "kube-api-access-fksd8") pod "d04d915b-b7fb-4d05-b341-acc770eb1f92" (UID: "d04d915b-b7fb-4d05-b341-acc770eb1f92"). InnerVolumeSpecName "kube-api-access-fksd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.584684 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fksd8\" (UniqueName: \"kubernetes.io/projected/d04d915b-b7fb-4d05-b341-acc770eb1f92-kube-api-access-fksd8\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:38 crc kubenswrapper[4837]: I1014 14:00:38.794628 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04d915b-b7fb-4d05-b341-acc770eb1f92" path="/var/lib/kubelet/pods/d04d915b-b7fb-4d05-b341-acc770eb1f92/volumes" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.302357 4837 scope.go:117] "RemoveContainer" containerID="eac1fb23e1f32330e4c73bbb3d3e3fd64011b059a64d3136f58670dbd2a19e3c" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.302427 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-pxpm9" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.682682 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-96xbk/crc-debug-p4rcr"] Oct 14 14:00:39 crc kubenswrapper[4837]: E1014 14:00:39.683545 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" containerName="collect-profiles" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.683563 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" containerName="collect-profiles" Oct 14 14:00:39 crc kubenswrapper[4837]: E1014 14:00:39.683580 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04d915b-b7fb-4d05-b341-acc770eb1f92" containerName="container-00" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.683589 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04d915b-b7fb-4d05-b341-acc770eb1f92" containerName="container-00" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.683861 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04d915b-b7fb-4d05-b341-acc770eb1f92" containerName="container-00" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.683901 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca5bb57-a0c3-43c4-8458-ce0607d6df2d" containerName="collect-profiles" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.684618 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.804817 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d603c0c-831b-4983-8973-4d53a7946ebb-host\") pod \"crc-debug-p4rcr\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.805030 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldg8s\" (UniqueName: \"kubernetes.io/projected/0d603c0c-831b-4983-8973-4d53a7946ebb-kube-api-access-ldg8s\") pod \"crc-debug-p4rcr\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.907299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d603c0c-831b-4983-8973-4d53a7946ebb-host\") pod \"crc-debug-p4rcr\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.907423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldg8s\" (UniqueName: \"kubernetes.io/projected/0d603c0c-831b-4983-8973-4d53a7946ebb-kube-api-access-ldg8s\") pod \"crc-debug-p4rcr\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:39 crc kubenswrapper[4837]: I1014 14:00:39.907472 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d603c0c-831b-4983-8973-4d53a7946ebb-host\") pod \"crc-debug-p4rcr\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:39 crc 
kubenswrapper[4837]: I1014 14:00:39.925997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldg8s\" (UniqueName: \"kubernetes.io/projected/0d603c0c-831b-4983-8973-4d53a7946ebb-kube-api-access-ldg8s\") pod \"crc-debug-p4rcr\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:40 crc kubenswrapper[4837]: I1014 14:00:40.001978 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:40 crc kubenswrapper[4837]: I1014 14:00:40.320140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-p4rcr" event={"ID":"0d603c0c-831b-4983-8973-4d53a7946ebb","Type":"ContainerStarted","Data":"525c264dd451fcb7fbe268656595d96afdd01396d615f7b3260fdb18b4fad366"} Oct 14 14:00:40 crc kubenswrapper[4837]: I1014 14:00:40.320437 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-p4rcr" event={"ID":"0d603c0c-831b-4983-8973-4d53a7946ebb","Type":"ContainerStarted","Data":"98b0d34e21102c2d5f37de66b678e67079312d5de61925acff0d1a1aeb4358be"} Oct 14 14:00:40 crc kubenswrapper[4837]: I1014 14:00:40.723397 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-96xbk/crc-debug-p4rcr"] Oct 14 14:00:40 crc kubenswrapper[4837]: I1014 14:00:40.731919 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-96xbk/crc-debug-p4rcr"] Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.139747 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.139835 4837 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.139892 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.140738 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.140842 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" gracePeriod=600 Oct 14 14:00:41 crc kubenswrapper[4837]: E1014 14:00:41.286432 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.340957 4837 generic.go:334] "Generic (PLEG): container finished" podID="0d603c0c-831b-4983-8973-4d53a7946ebb" 
containerID="525c264dd451fcb7fbe268656595d96afdd01396d615f7b3260fdb18b4fad366" exitCode=0 Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.344869 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" exitCode=0 Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.345003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb"} Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.345103 4837 scope.go:117] "RemoveContainer" containerID="54035085edd5f91a9f13e042f6da48247351b7f4344991c2baca4736f653e46f" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.345868 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:00:41 crc kubenswrapper[4837]: E1014 14:00:41.346305 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.433872 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.535034 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldg8s\" (UniqueName: \"kubernetes.io/projected/0d603c0c-831b-4983-8973-4d53a7946ebb-kube-api-access-ldg8s\") pod \"0d603c0c-831b-4983-8973-4d53a7946ebb\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.535141 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d603c0c-831b-4983-8973-4d53a7946ebb-host\") pod \"0d603c0c-831b-4983-8973-4d53a7946ebb\" (UID: \"0d603c0c-831b-4983-8973-4d53a7946ebb\") " Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.536857 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d603c0c-831b-4983-8973-4d53a7946ebb-host" (OuterVolumeSpecName: "host") pod "0d603c0c-831b-4983-8973-4d53a7946ebb" (UID: "0d603c0c-831b-4983-8973-4d53a7946ebb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.547409 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d603c0c-831b-4983-8973-4d53a7946ebb-kube-api-access-ldg8s" (OuterVolumeSpecName: "kube-api-access-ldg8s") pod "0d603c0c-831b-4983-8973-4d53a7946ebb" (UID: "0d603c0c-831b-4983-8973-4d53a7946ebb"). InnerVolumeSpecName "kube-api-access-ldg8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.637725 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldg8s\" (UniqueName: \"kubernetes.io/projected/0d603c0c-831b-4983-8973-4d53a7946ebb-kube-api-access-ldg8s\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.637961 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d603c0c-831b-4983-8973-4d53a7946ebb-host\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.925006 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-96xbk/crc-debug-sldqt"] Oct 14 14:00:41 crc kubenswrapper[4837]: E1014 14:00:41.925427 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d603c0c-831b-4983-8973-4d53a7946ebb" containerName="container-00" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.925442 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d603c0c-831b-4983-8973-4d53a7946ebb" containerName="container-00" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.925625 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d603c0c-831b-4983-8973-4d53a7946ebb" containerName="container-00" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.926257 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.943537 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh4kb\" (UniqueName: \"kubernetes.io/projected/6307324b-c465-4f6a-a102-126da285db29-kube-api-access-nh4kb\") pod \"crc-debug-sldqt\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:41 crc kubenswrapper[4837]: I1014 14:00:41.943671 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6307324b-c465-4f6a-a102-126da285db29-host\") pod \"crc-debug-sldqt\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.044785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6307324b-c465-4f6a-a102-126da285db29-host\") pod \"crc-debug-sldqt\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.044941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6307324b-c465-4f6a-a102-126da285db29-host\") pod \"crc-debug-sldqt\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.044959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh4kb\" (UniqueName: \"kubernetes.io/projected/6307324b-c465-4f6a-a102-126da285db29-kube-api-access-nh4kb\") pod \"crc-debug-sldqt\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:42 crc 
kubenswrapper[4837]: I1014 14:00:42.080355 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh4kb\" (UniqueName: \"kubernetes.io/projected/6307324b-c465-4f6a-a102-126da285db29-kube-api-access-nh4kb\") pod \"crc-debug-sldqt\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.242393 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:42 crc kubenswrapper[4837]: W1014 14:00:42.275265 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6307324b_c465_4f6a_a102_126da285db29.slice/crio-67c5866ea72aebedaac4eeaf091008273c1436238e919cb72953a6273d431336 WatchSource:0}: Error finding container 67c5866ea72aebedaac4eeaf091008273c1436238e919cb72953a6273d431336: Status 404 returned error can't find the container with id 67c5866ea72aebedaac4eeaf091008273c1436238e919cb72953a6273d431336 Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.361135 4837 scope.go:117] "RemoveContainer" containerID="525c264dd451fcb7fbe268656595d96afdd01396d615f7b3260fdb18b4fad366" Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.361707 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-p4rcr" Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.366507 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-sldqt" event={"ID":"6307324b-c465-4f6a-a102-126da285db29","Type":"ContainerStarted","Data":"67c5866ea72aebedaac4eeaf091008273c1436238e919cb72953a6273d431336"} Oct 14 14:00:42 crc kubenswrapper[4837]: I1014 14:00:42.797550 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d603c0c-831b-4983-8973-4d53a7946ebb" path="/var/lib/kubelet/pods/0d603c0c-831b-4983-8973-4d53a7946ebb/volumes" Oct 14 14:00:43 crc kubenswrapper[4837]: I1014 14:00:43.383880 4837 generic.go:334] "Generic (PLEG): container finished" podID="6307324b-c465-4f6a-a102-126da285db29" containerID="93a273cc9d54154e2640551508d1e625bb4c4780fcefbc7021df3759ad59b02b" exitCode=0 Oct 14 14:00:43 crc kubenswrapper[4837]: I1014 14:00:43.383936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/crc-debug-sldqt" event={"ID":"6307324b-c465-4f6a-a102-126da285db29","Type":"ContainerDied","Data":"93a273cc9d54154e2640551508d1e625bb4c4780fcefbc7021df3759ad59b02b"} Oct 14 14:00:43 crc kubenswrapper[4837]: I1014 14:00:43.438719 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-96xbk/crc-debug-sldqt"] Oct 14 14:00:43 crc kubenswrapper[4837]: I1014 14:00:43.460114 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-96xbk/crc-debug-sldqt"] Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.500477 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.590521 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6307324b-c465-4f6a-a102-126da285db29-host\") pod \"6307324b-c465-4f6a-a102-126da285db29\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.590618 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh4kb\" (UniqueName: \"kubernetes.io/projected/6307324b-c465-4f6a-a102-126da285db29-kube-api-access-nh4kb\") pod \"6307324b-c465-4f6a-a102-126da285db29\" (UID: \"6307324b-c465-4f6a-a102-126da285db29\") " Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.590644 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6307324b-c465-4f6a-a102-126da285db29-host" (OuterVolumeSpecName: "host") pod "6307324b-c465-4f6a-a102-126da285db29" (UID: "6307324b-c465-4f6a-a102-126da285db29"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.591445 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6307324b-c465-4f6a-a102-126da285db29-host\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.594964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6307324b-c465-4f6a-a102-126da285db29-kube-api-access-nh4kb" (OuterVolumeSpecName: "kube-api-access-nh4kb") pod "6307324b-c465-4f6a-a102-126da285db29" (UID: "6307324b-c465-4f6a-a102-126da285db29"). InnerVolumeSpecName "kube-api-access-nh4kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.692695 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh4kb\" (UniqueName: \"kubernetes.io/projected/6307324b-c465-4f6a-a102-126da285db29-kube-api-access-nh4kb\") on node \"crc\" DevicePath \"\"" Oct 14 14:00:44 crc kubenswrapper[4837]: I1014 14:00:44.804092 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6307324b-c465-4f6a-a102-126da285db29" path="/var/lib/kubelet/pods/6307324b-c465-4f6a-a102-126da285db29/volumes" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.401018 4837 scope.go:117] "RemoveContainer" containerID="93a273cc9d54154e2640551508d1e625bb4c4780fcefbc7021df3759ad59b02b" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.401051 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/crc-debug-sldqt" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.517447 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5799b74b9d-p594h_7cb7fa99-fe9e-4e56-a3ef-26c6ad271530/barbican-api/0.log" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.600458 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5799b74b9d-p594h_7cb7fa99-fe9e-4e56-a3ef-26c6ad271530/barbican-api-log/0.log" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.695930 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5dd7b6957d-hqts4_e52b1001-3fb0-415b-be6a-e55a548462ac/barbican-keystone-listener/0.log" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.749787 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5dd7b6957d-hqts4_e52b1001-3fb0-415b-be6a-e55a548462ac/barbican-keystone-listener-log/0.log" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.848944 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-64dbf95879-s4jqv_d42a5890-7561-4b99-9518-0c6c672217d9/barbican-worker/0.log" Oct 14 14:00:45 crc kubenswrapper[4837]: I1014 14:00:45.990045 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-64dbf95879-s4jqv_d42a5890-7561-4b99-9518-0c6c672217d9/barbican-worker-log/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.067407 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b_6217fcbf-8651-4d63-b670-71de72f5feed/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.184874 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/ceilometer-central-agent/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.209046 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/ceilometer-notification-agent/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.253502 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/sg-core/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.286204 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/proxy-httpd/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.420294 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_92e412ce-d61d-4c7f-8297-ce2cc5011325/cinder-api-log/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.431535 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_92e412ce-d61d-4c7f-8297-ce2cc5011325/cinder-api/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 
14:00:46.627756 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_aa0d5e89-66d3-4f22-9704-7c3c35ee537f/cinder-scheduler/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.634211 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_aa0d5e89-66d3-4f22-9704-7c3c35ee537f/probe/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.715430 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp_b523b3d5-ba31-4620-8287-055d6bc931cc/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.817530 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd_b030d75a-71e0-41af-9ab0-298924d1a955/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:46 crc kubenswrapper[4837]: I1014 14:00:46.917704 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m9npl_1febddb2-b222-433e-b8bc-47a3956bc38d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.032724 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-c696x_4a6f65cd-fd19-4b6a-9dee-4ef117beb86f/init/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.172121 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-c696x_4a6f65cd-fd19-4b6a-9dee-4ef117beb86f/init/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.209231 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-c696x_4a6f65cd-fd19-4b6a-9dee-4ef117beb86f/dnsmasq-dns/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.280646 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-shsmx_714cea27-46ab-4d03-b5d1-81b42d99f6f6/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.406587 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a6304802-caa4-4ed2-a570-fc09f7c940b5/glance-log/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.422087 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a6304802-caa4-4ed2-a570-fc09f7c940b5/glance-httpd/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.566749 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d215f9ee-cdfe-47a1-8240-e74f9f81f97d/glance-httpd/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.606931 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d215f9ee-cdfe-47a1-8240-e74f9f81f97d/glance-log/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.776451 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b48ff9644-mb62f_0d3a61c6-2a73-409f-b296-10f7a19685d6/horizon/0.log" Oct 14 14:00:47 crc kubenswrapper[4837]: I1014 14:00:47.904040 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd_97b10dd3-253f-47fa-ad50-4765f7139f4f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.101751 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hr9tc_12398715-a536-446f-81aa-00aa7b0546ed/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.124695 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-b48ff9644-mb62f_0d3a61c6-2a73-409f-b296-10f7a19685d6/horizon-log/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.302517 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b20b7196-1920-4b12-a38e-7356ca4dc4e2/kube-state-metrics/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.398543 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7bcd589b8f-ljfsq_9dd1fb1b-4520-43c9-8a24-fd0a225856a3/keystone-api/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.601804 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-78vdf_aacf282f-f2c7-447d-9e73-98a35898f8df/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.920757 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f7c5db7df-7tsqg_a7d3bc97-ce39-472b-860e-79b620b726f1/neutron-httpd/0.log" Oct 14 14:00:48 crc kubenswrapper[4837]: I1014 14:00:48.978711 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f7c5db7df-7tsqg_a7d3bc97-ce39-472b-860e-79b620b726f1/neutron-api/0.log" Oct 14 14:00:49 crc kubenswrapper[4837]: I1014 14:00:49.111587 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h_e0186e2a-7938-4646-ba9c-768d75c09605/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:49 crc kubenswrapper[4837]: I1014 14:00:49.554303 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bd40b584-fe33-48fa-a09b-e50f7b40f785/nova-cell0-conductor-conductor/0.log" Oct 14 14:00:49 crc kubenswrapper[4837]: I1014 14:00:49.586984 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9a329d7-874c-4b64-b23e-10463d345068/nova-api-log/0.log" 
Oct 14 14:00:49 crc kubenswrapper[4837]: I1014 14:00:49.741361 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9a329d7-874c-4b64-b23e-10463d345068/nova-api-api/0.log" Oct 14 14:00:49 crc kubenswrapper[4837]: I1014 14:00:49.883608 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c2118184-3d60-4d7a-b203-961341c9be78/nova-cell1-conductor-conductor/0.log" Oct 14 14:00:49 crc kubenswrapper[4837]: I1014 14:00:49.920860 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_abb35cf2-796d-40bc-8b6b-d421dec44645/nova-cell1-novncproxy-novncproxy/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.091823 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9v67f_51ebf601-fdd4-46d5-b68e-97846a7baff5/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.183397 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2db56d67-c528-47cc-8569-d9636ebd2667/nova-metadata-log/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.465855 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ebe0b001-1902-4166-a8a3-b3d0c54139f4/nova-scheduler-scheduler/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.612835 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11ead61d-f315-4ee0-9dcb-a222012a9c36/mysql-bootstrap/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.770866 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11ead61d-f315-4ee0-9dcb-a222012a9c36/galera/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.809682 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_11ead61d-f315-4ee0-9dcb-a222012a9c36/mysql-bootstrap/0.log" Oct 14 14:00:50 crc kubenswrapper[4837]: I1014 14:00:50.962171 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_162a8777-0979-4087-959a-98cd20678758/mysql-bootstrap/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.177932 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_162a8777-0979-4087-959a-98cd20678758/mysql-bootstrap/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.178302 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_162a8777-0979-4087-959a-98cd20678758/galera/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.333930 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2db56d67-c528-47cc-8569-d9636ebd2667/nova-metadata-metadata/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.391072 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_75fffdca-61c2-4af0-a87d-1662358aa171/openstackclient/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.481061 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-j4tpc_14f970e0-8d42-46d6-937a-c39f521f6bea/ovn-controller/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.609248 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vj9cc_43da7026-edd3-4f7f-9944-1aff537446a0/openstack-network-exporter/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.696976 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovsdb-server-init/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.885126 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovsdb-server/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.914133 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovsdb-server-init/0.log" Oct 14 14:00:51 crc kubenswrapper[4837]: I1014 14:00:51.915214 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovs-vswitchd/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.153836 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hc8z6_6f5e2181-b922-48d6-909c-ad1f87fee631/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.161376 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f648709-678d-4844-8571-ac5c5c5712a3/openstack-network-exporter/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.277147 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f648709-678d-4844-8571-ac5c5c5712a3/ovn-northd/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.408624 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cc544b19-4b52-46ca-9c0b-518f78ebb47b/ovsdbserver-nb/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.411774 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cc544b19-4b52-46ca-9c0b-518f78ebb47b/openstack-network-exporter/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.547819 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7f5a204e-7b4c-41c2-8d69-e93d3c986249/openstack-network-exporter/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.577228 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7f5a204e-7b4c-41c2-8d69-e93d3c986249/ovsdbserver-sb/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.794672 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:00:52 crc kubenswrapper[4837]: E1014 14:00:52.794914 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.887215 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67c99f9644-lpk76_3be94ea9-34d4-4765-92cf-93345cfb88bb/placement-api/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.921375 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67c99f9644-lpk76_3be94ea9-34d4-4765-92cf-93345cfb88bb/placement-log/0.log" Oct 14 14:00:52 crc kubenswrapper[4837]: I1014 14:00:52.941910 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7edbbd-c98f-4800-a4ae-49ea0de7f12d/setup-container/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.210823 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7edbbd-c98f-4800-a4ae-49ea0de7f12d/setup-container/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.253962 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7edbbd-c98f-4800-a4ae-49ea0de7f12d/rabbitmq/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.270386 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-server-0_c4e21425-fc2a-487e-bb81-615828fd727f/setup-container/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.419290 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4e21425-fc2a-487e-bb81-615828fd727f/setup-container/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.456333 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4e21425-fc2a-487e-bb81-615828fd727f/rabbitmq/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.548586 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj_611b04f3-d9fa-4841-8cd5-608c99279890/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.669131 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ggmtq_d5e9e4de-2bda-45cb-a580-b89e8dee024e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.804505 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv_8fbd0386-a3fe-4ad1-8b44-0945dd47a255/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:53 crc kubenswrapper[4837]: I1014 14:00:53.900973 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2nnnt_ddd3587a-7e10-4ad8-90bf-c172acc6e635/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.043720 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m8c7v_805056c6-9ce3-4dcf-852d-2a71b8627f80/ssh-known-hosts-edpm-deployment/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.320411 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-proxy-68b7b9db59-mdpgm_7ac8f443-1071-49d6-94d2-e7fea6f09cc5/proxy-httpd/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.321280 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-68b7b9db59-mdpgm_7ac8f443-1071-49d6-94d2-e7fea6f09cc5/proxy-server/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.425198 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-zww9d_1960b9c9-0169-447b-a184-21c3522760f8/swift-ring-rebalance/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.529063 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-auditor/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.608381 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-reaper/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.683223 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-replicator/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.752454 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-server/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.773384 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-auditor/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.852593 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-replicator/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.882357 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-server/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.938018 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-updater/0.log" Oct 14 14:00:54 crc kubenswrapper[4837]: I1014 14:00:54.993678 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-auditor/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.058539 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-expirer/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.082576 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-replicator/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.197682 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-server/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.226689 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-updater/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.266285 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/rsync/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.270988 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/swift-recon-cron/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.470313 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn_aa5f8d90-d124-49cd-ac34-f24b91f0a457/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.664242 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5726ac57-50cd-4985-a6cd-86a9683bb283/test-operator-logs-container/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.720675 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_beac4f98-00d6-438b-86cc-2f85d2ca1f96/tempest-tests-tempest-tests-runner/0.log" Oct 14 14:00:55 crc kubenswrapper[4837]: I1014 14:00:55.885963 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-qxb82_6fa36834-4501-43b2-8084-2c79052f5185/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.148753 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29340841-2cmzh"] Oct 14 14:01:00 crc kubenswrapper[4837]: E1014 14:01:00.149551 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6307324b-c465-4f6a-a102-126da285db29" containerName="container-00" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.149568 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6307324b-c465-4f6a-a102-126da285db29" containerName="container-00" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.149834 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6307324b-c465-4f6a-a102-126da285db29" containerName="container-00" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.150668 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.166137 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29340841-2cmzh"] Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.265804 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftf4m\" (UniqueName: \"kubernetes.io/projected/9e98c493-7d92-4165-918b-44dace3ca02a-kube-api-access-ftf4m\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.265877 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-combined-ca-bundle\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.265900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-config-data\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.265950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-fernet-keys\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.367005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ftf4m\" (UniqueName: \"kubernetes.io/projected/9e98c493-7d92-4165-918b-44dace3ca02a-kube-api-access-ftf4m\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.367265 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-combined-ca-bundle\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.367430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-config-data\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.367532 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-fernet-keys\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.373205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-config-data\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.375551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-fernet-keys\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.378901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-combined-ca-bundle\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.383304 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftf4m\" (UniqueName: \"kubernetes.io/projected/9e98c493-7d92-4165-918b-44dace3ca02a-kube-api-access-ftf4m\") pod \"keystone-cron-29340841-2cmzh\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.476378 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:00 crc kubenswrapper[4837]: I1014 14:01:00.955258 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29340841-2cmzh"] Oct 14 14:01:01 crc kubenswrapper[4837]: I1014 14:01:01.571557 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29340841-2cmzh" event={"ID":"9e98c493-7d92-4165-918b-44dace3ca02a","Type":"ContainerStarted","Data":"745fa30cd0de4ace3ce6a2fd7d139dfa5ba4ea3ea8f952742e5e9231632fd0d0"} Oct 14 14:01:01 crc kubenswrapper[4837]: I1014 14:01:01.571846 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29340841-2cmzh" event={"ID":"9e98c493-7d92-4165-918b-44dace3ca02a","Type":"ContainerStarted","Data":"ee1aaf59d4b201b700afdaf62a3e4d63c14ab9833011e5dfc82d8838f377cc4c"} Oct 14 14:01:01 crc kubenswrapper[4837]: I1014 14:01:01.596754 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29340841-2cmzh" podStartSLOduration=1.596731755 podStartE2EDuration="1.596731755s" podCreationTimestamp="2025-10-14 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 14:01:01.584822853 +0000 UTC m=+3599.501822666" watchObservedRunningTime="2025-10-14 14:01:01.596731755 +0000 UTC m=+3599.513731578" Oct 14 14:01:03 crc kubenswrapper[4837]: I1014 14:01:03.594499 4837 generic.go:334] "Generic (PLEG): container finished" podID="9e98c493-7d92-4165-918b-44dace3ca02a" containerID="745fa30cd0de4ace3ce6a2fd7d139dfa5ba4ea3ea8f952742e5e9231632fd0d0" exitCode=0 Oct 14 14:01:03 crc kubenswrapper[4837]: I1014 14:01:03.594576 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29340841-2cmzh" 
event={"ID":"9e98c493-7d92-4165-918b-44dace3ca02a","Type":"ContainerDied","Data":"745fa30cd0de4ace3ce6a2fd7d139dfa5ba4ea3ea8f952742e5e9231632fd0d0"} Oct 14 14:01:04 crc kubenswrapper[4837]: I1014 14:01:04.989805 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.044735 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftf4m\" (UniqueName: \"kubernetes.io/projected/9e98c493-7d92-4165-918b-44dace3ca02a-kube-api-access-ftf4m\") pod \"9e98c493-7d92-4165-918b-44dace3ca02a\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.044912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-combined-ca-bundle\") pod \"9e98c493-7d92-4165-918b-44dace3ca02a\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.044953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-config-data\") pod \"9e98c493-7d92-4165-918b-44dace3ca02a\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.044973 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-fernet-keys\") pod \"9e98c493-7d92-4165-918b-44dace3ca02a\" (UID: \"9e98c493-7d92-4165-918b-44dace3ca02a\") " Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.059957 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e98c493-7d92-4165-918b-44dace3ca02a-kube-api-access-ftf4m" 
(OuterVolumeSpecName: "kube-api-access-ftf4m") pod "9e98c493-7d92-4165-918b-44dace3ca02a" (UID: "9e98c493-7d92-4165-918b-44dace3ca02a"). InnerVolumeSpecName "kube-api-access-ftf4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.067919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9e98c493-7d92-4165-918b-44dace3ca02a" (UID: "9e98c493-7d92-4165-918b-44dace3ca02a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.081452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e98c493-7d92-4165-918b-44dace3ca02a" (UID: "9e98c493-7d92-4165-918b-44dace3ca02a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.146762 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.146790 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.146799 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftf4m\" (UniqueName: \"kubernetes.io/projected/9e98c493-7d92-4165-918b-44dace3ca02a-kube-api-access-ftf4m\") on node \"crc\" DevicePath \"\"" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.159336 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-config-data" (OuterVolumeSpecName: "config-data") pod "9e98c493-7d92-4165-918b-44dace3ca02a" (UID: "9e98c493-7d92-4165-918b-44dace3ca02a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.248108 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98c493-7d92-4165-918b-44dace3ca02a-config-data\") on node \"crc\" DevicePath \"\"" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.612661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29340841-2cmzh" event={"ID":"9e98c493-7d92-4165-918b-44dace3ca02a","Type":"ContainerDied","Data":"ee1aaf59d4b201b700afdaf62a3e4d63c14ab9833011e5dfc82d8838f377cc4c"} Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.612926 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee1aaf59d4b201b700afdaf62a3e4d63c14ab9833011e5dfc82d8838f377cc4c" Oct 14 14:01:05 crc kubenswrapper[4837]: I1014 14:01:05.612995 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29340841-2cmzh" Oct 14 14:01:06 crc kubenswrapper[4837]: I1014 14:01:06.190633 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0454524b-c83b-4049-ad05-8b29a317bc91/memcached/0.log" Oct 14 14:01:06 crc kubenswrapper[4837]: I1014 14:01:06.785266 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:01:06 crc kubenswrapper[4837]: E1014 14:01:06.785571 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:01:17 crc kubenswrapper[4837]: I1014 14:01:17.785114 4837 scope.go:117] 
"RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:01:17 crc kubenswrapper[4837]: E1014 14:01:17.785900 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:01:18 crc kubenswrapper[4837]: I1014 14:01:18.889770 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/util/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.091846 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/pull/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.099755 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/util/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.113546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/pull/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.281011 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/pull/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.312056 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/extract/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.318193 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/util/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.430677 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-49s6c_e1d5f52e-4c67-4242-bea3-6eef9fb72623/kube-rbac-proxy/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.547946 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-49s6c_e1d5f52e-4c67-4242-bea3-6eef9fb72623/manager/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.560371 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5f627_53370b8e-db35-4a50-af38-f24ac2fad459/kube-rbac-proxy/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.656008 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5f627_53370b8e-db35-4a50-af38-f24ac2fad459/manager/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.731758 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-jczn9_37e6419b-1647-43e2-89ef-67deae94e8b3/manager/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: I1014 14:01:19.732313 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-jczn9_37e6419b-1647-43e2-89ef-67deae94e8b3/kube-rbac-proxy/0.log" Oct 14 14:01:19 crc kubenswrapper[4837]: 
I1014 14:01:19.908813 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-q9dmc_f915ddfd-5160-4f57-85a8-9b5fe02c1908/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.026286 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-q9dmc_f915ddfd-5160-4f57-85a8-9b5fe02c1908/manager/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.063083 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qkd6g_cfce54d9-39e9-4b1f-bb95-11d72de2cbdc/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.090343 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qkd6g_cfce54d9-39e9-4b1f-bb95-11d72de2cbdc/manager/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.199711 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-9hctf_f7815a82-8a77-47a1-8a07-966eb6340b2b/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.225210 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-9hctf_f7815a82-8a77-47a1-8a07-966eb6340b2b/manager/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.336737 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-7xv4c_62e66325-7f63-4815-9f2d-fafbd138fa4e/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.528466 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-7xv4c_62e66325-7f63-4815-9f2d-fafbd138fa4e/manager/0.log" Oct 14 
14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.534265 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-n6pcv_32cb1840-83d3-40ec-859a-15391e369bde/manager/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.539432 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-n6pcv_32cb1840-83d3-40ec-859a-15391e369bde/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.700942 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-fq8p9_4f4fbd70-1ccf-4509-8552-ab902e8e7a0f/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.777661 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-fq8p9_4f4fbd70-1ccf-4509-8552-ab902e8e7a0f/manager/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.855010 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-87zsz_62be7f3d-ddbe-4470-ace0-0907330b09ac/kube-rbac-proxy/0.log" Oct 14 14:01:20 crc kubenswrapper[4837]: I1014 14:01:20.908587 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-87zsz_62be7f3d-ddbe-4470-ace0-0907330b09ac/manager/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.003743 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-mdzbv_e4f5b829-46e0-4048-9b51-1a9256375d4f/kube-rbac-proxy/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.032475 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-mdzbv_e4f5b829-46e0-4048-9b51-1a9256375d4f/manager/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.143359 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-b2hkp_7f62a453-6fb4-4769-a2ef-da03024d8e90/kube-rbac-proxy/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.203693 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-b2hkp_7f62a453-6fb4-4769-a2ef-da03024d8e90/manager/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.328521 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-hx2m5_4d12bc33-de6d-405c-b539-72ab956b4234/kube-rbac-proxy/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.410940 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-hx2m5_4d12bc33-de6d-405c-b539-72ab956b4234/manager/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.457512 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-6w7km_fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e/kube-rbac-proxy/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.536056 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-6w7km_fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e/manager/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.589037 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b_922d6301-937e-403a-ade6-06620798c61c/kube-rbac-proxy/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 
14:01:21.649297 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b_922d6301-937e-403a-ade6-06620798c61c/manager/0.log" Oct 14 14:01:21 crc kubenswrapper[4837]: I1014 14:01:21.763178 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-84c49f8869-sxmsq_293d5905-c149-4fe1-a09d-204cc4cff4e6/kube-rbac-proxy/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.009077 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f9b497985-b8x95_6092738b-995d-48bd-a9a7-0c5b4caebea9/kube-rbac-proxy/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.127237 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f9b497985-b8x95_6092738b-995d-48bd-a9a7-0c5b4caebea9/operator/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.199396 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qq6lf_aafb3bab-e32a-4523-8b72-b3131408a0be/registry-server/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.414080 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-mlhgx_1ac92ea3-d385-42f1-bc27-59a93f495cbc/kube-rbac-proxy/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.487978 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-mlhgx_1ac92ea3-d385-42f1-bc27-59a93f495cbc/manager/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.562172 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-jkjjd_a086b7d2-5401-4754-9825-2425a3a2aa22/kube-rbac-proxy/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.665565 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-jkjjd_a086b7d2-5401-4754-9825-2425a3a2aa22/manager/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.780367 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs_be28404e-866e-4ffd-8cfc-a43090217244/operator/0.log" Oct 14 14:01:22 crc kubenswrapper[4837]: I1014 14:01:22.949246 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-84c49f8869-sxmsq_293d5905-c149-4fe1-a09d-204cc4cff4e6/manager/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.013229 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-t8t4r_640618dc-c509-410b-9669-9b77a1f8d068/manager/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.030693 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-t8t4r_640618dc-c509-410b-9669-9b77a1f8d068/kube-rbac-proxy/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.138520 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-78pkj_7d205182-3314-4282-800d-4dc57b64f416/kube-rbac-proxy/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.205445 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-78pkj_7d205182-3314-4282-800d-4dc57b64f416/manager/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 
14:01:23.214621 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mhs2q_c2182e6f-c24c-4164-a269-4c11d34057a7/kube-rbac-proxy/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.237245 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mhs2q_c2182e6f-c24c-4164-a269-4c11d34057a7/manager/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.375903 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-rfdvw_6017e7af-9d95-42c3-9f9c-bbd3df49f4f4/kube-rbac-proxy/0.log" Oct 14 14:01:23 crc kubenswrapper[4837]: I1014 14:01:23.393447 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-rfdvw_6017e7af-9d95-42c3-9f9c-bbd3df49f4f4/manager/0.log" Oct 14 14:01:31 crc kubenswrapper[4837]: I1014 14:01:31.784239 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:01:31 crc kubenswrapper[4837]: E1014 14:01:31.785063 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:01:38 crc kubenswrapper[4837]: I1014 14:01:38.480543 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wzp95_1577b547-7e30-4b8e-9959-fdd88088041c/control-plane-machine-set-operator/0.log" Oct 14 14:01:38 crc kubenswrapper[4837]: I1014 14:01:38.663566 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dfjx8_675483c3-eb80-41b4-b02b-db9059ec788b/kube-rbac-proxy/0.log" Oct 14 14:01:38 crc kubenswrapper[4837]: I1014 14:01:38.663600 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dfjx8_675483c3-eb80-41b4-b02b-db9059ec788b/machine-api-operator/0.log" Oct 14 14:01:42 crc kubenswrapper[4837]: I1014 14:01:42.793739 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:01:42 crc kubenswrapper[4837]: E1014 14:01:42.794441 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:01:50 crc kubenswrapper[4837]: I1014 14:01:50.102006 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-65869_f782d810-8b08-4a07-b024-0481a26cf944/cert-manager-controller/0.log" Oct 14 14:01:50 crc kubenswrapper[4837]: I1014 14:01:50.304470 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-nqrzh_ffee16ce-49f5-418a-b83c-64b60165f84e/cert-manager-cainjector/0.log" Oct 14 14:01:50 crc kubenswrapper[4837]: I1014 14:01:50.322719 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zktdz_ca647993-67e2-4c73-b529-68deed403e7f/cert-manager-webhook/0.log" Oct 14 14:01:55 crc kubenswrapper[4837]: I1014 14:01:55.785088 4837 scope.go:117] "RemoveContainer" 
containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:01:55 crc kubenswrapper[4837]: E1014 14:01:55.785963 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:02:01 crc kubenswrapper[4837]: I1014 14:02:01.695870 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-5jz9k_8e30fca9-8930-4438-baeb-6cd8437d808e/nmstate-console-plugin/0.log" Oct 14 14:02:01 crc kubenswrapper[4837]: I1014 14:02:01.896482 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zpg5h_f596383f-8fd9-42cc-9554-8cfac0f1cbeb/nmstate-handler/0.log" Oct 14 14:02:01 crc kubenswrapper[4837]: I1014 14:02:01.937062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2fxhs_fe630318-04d6-4ba7-98d4-004f61f9e801/kube-rbac-proxy/0.log" Oct 14 14:02:01 crc kubenswrapper[4837]: I1014 14:02:01.962537 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2fxhs_fe630318-04d6-4ba7-98d4-004f61f9e801/nmstate-metrics/0.log" Oct 14 14:02:02 crc kubenswrapper[4837]: I1014 14:02:02.093697 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-qmtct_a0b26320-e880-47dc-8ead-5b4547870db1/nmstate-operator/0.log" Oct 14 14:02:02 crc kubenswrapper[4837]: I1014 14:02:02.154217 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-dktjp_2e3f42bf-7e0b-4969-8b2e-0479072f35a4/nmstate-webhook/0.log" Oct 14 14:02:09 crc kubenswrapper[4837]: I1014 14:02:09.784664 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:02:09 crc kubenswrapper[4837]: E1014 14:02:09.785311 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.130584 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-9c8q6_f744e2d8-9bff-4348-8014-42a4a7a5cc20/kube-rbac-proxy/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.216427 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-9c8q6_f744e2d8-9bff-4348-8014-42a4a7a5cc20/controller/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.333637 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.520532 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.533364 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.537257 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.538966 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.682241 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.711213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.717439 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.718598 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.872258 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.873369 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.912375 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:02:15 crc kubenswrapper[4837]: I1014 14:02:15.931571 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/controller/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.056357 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/frr-metrics/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.098654 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/kube-rbac-proxy/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.137355 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/kube-rbac-proxy-frr/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.245427 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/reloader/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.338645 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-hvtxf_529d2022-65d4-49b1-801d-f14d900cfdf7/frr-k8s-webhook-server/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.514680 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bb79b9dd7-l248c_0c827589-1da4-40cd-967d-4144c014cee8/manager/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.630114 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5788b958cf-vqdk2_58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5/webhook-server/0.log" Oct 14 14:02:16 crc kubenswrapper[4837]: I1014 14:02:16.799359 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tlqsk_f5bb08ae-810b-4b13-a2aa-6ff68721a5a3/kube-rbac-proxy/0.log" Oct 14 14:02:17 crc kubenswrapper[4837]: I1014 14:02:17.302251 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tlqsk_f5bb08ae-810b-4b13-a2aa-6ff68721a5a3/speaker/0.log" Oct 14 14:02:17 crc kubenswrapper[4837]: I1014 14:02:17.513363 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/frr/0.log" Oct 14 14:02:20 crc kubenswrapper[4837]: I1014 14:02:20.784915 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:02:20 crc kubenswrapper[4837]: E1014 14:02:20.785200 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.283660 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/util/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.458874 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/util/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.467398 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/pull/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.504582 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/pull/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.667235 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/util/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.675713 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/extract/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.687134 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/pull/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.801935 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-utilities/0.log" Oct 14 14:02:28 crc kubenswrapper[4837]: I1014 14:02:28.986853 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-content/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.006750 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-content/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.010229 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-utilities/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.162766 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-utilities/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.222220 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-content/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.413038 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-utilities/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.588827 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-utilities/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.696401 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-content/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.696402 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-content/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.874346 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-utilities/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.912422 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-content/0.log" Oct 14 14:02:29 crc kubenswrapper[4837]: I1014 14:02:29.918181 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/registry-server/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.109589 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/util/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.421067 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/pull/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.432475 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/util/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.457928 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/pull/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.505444 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/registry-server/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.617559 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/pull/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.624261 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/util/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.649543 
4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/extract/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.824215 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-t8tjh_cde48aeb-8f47-4fde-a2cb-a95c09051e43/marketplace-operator/0.log" Oct 14 14:02:30 crc kubenswrapper[4837]: I1014 14:02:30.857913 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-utilities/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.016697 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-content/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.023743 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-utilities/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.039749 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-content/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.203406 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-content/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.206562 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-utilities/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.341534 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/registry-server/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.400677 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-utilities/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.577394 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-content/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.598828 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-content/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.604852 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-utilities/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.732836 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-content/0.log" Oct 14 14:02:31 crc kubenswrapper[4837]: I1014 14:02:31.791235 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-utilities/0.log" Oct 14 14:02:32 crc kubenswrapper[4837]: I1014 14:02:32.262726 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/registry-server/0.log" Oct 14 14:02:34 crc kubenswrapper[4837]: I1014 14:02:34.786004 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:02:34 crc kubenswrapper[4837]: E1014 14:02:34.786738 
4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:02:46 crc kubenswrapper[4837]: I1014 14:02:46.785179 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:02:46 crc kubenswrapper[4837]: E1014 14:02:46.785960 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:03:01 crc kubenswrapper[4837]: I1014 14:03:01.784290 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:03:01 crc kubenswrapper[4837]: E1014 14:03:01.785066 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:03:15 crc kubenswrapper[4837]: I1014 14:03:15.784302 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:03:15 crc kubenswrapper[4837]: E1014 
14:03:15.785218 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:03:29 crc kubenswrapper[4837]: I1014 14:03:29.785269 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:03:29 crc kubenswrapper[4837]: E1014 14:03:29.786146 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:03:42 crc kubenswrapper[4837]: I1014 14:03:42.793244 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:03:42 crc kubenswrapper[4837]: E1014 14:03:42.794262 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:03:53 crc kubenswrapper[4837]: I1014 14:03:53.787453 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:03:53 crc 
kubenswrapper[4837]: E1014 14:03:53.788384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:04:04 crc kubenswrapper[4837]: I1014 14:04:04.784945 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:04:04 crc kubenswrapper[4837]: E1014 14:04:04.785599 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.306971 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nwrnv"] Oct 14 14:04:06 crc kubenswrapper[4837]: E1014 14:04:06.307911 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98c493-7d92-4165-918b-44dace3ca02a" containerName="keystone-cron" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.307932 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98c493-7d92-4165-918b-44dace3ca02a" containerName="keystone-cron" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.308261 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98c493-7d92-4165-918b-44dace3ca02a" containerName="keystone-cron" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.310234 4837 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.322098 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwrnv"] Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.412914 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-utilities\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.413306 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtkhw\" (UniqueName: \"kubernetes.io/projected/3d56e6b8-4127-4b29-aedf-474163cacd81-kube-api-access-gtkhw\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.413409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-catalog-content\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.515412 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-utilities\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.515588 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gtkhw\" (UniqueName: \"kubernetes.io/projected/3d56e6b8-4127-4b29-aedf-474163cacd81-kube-api-access-gtkhw\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.515626 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-catalog-content\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.515864 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-utilities\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.515923 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-catalog-content\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.540663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtkhw\" (UniqueName: \"kubernetes.io/projected/3d56e6b8-4127-4b29-aedf-474163cacd81-kube-api-access-gtkhw\") pod \"redhat-operators-nwrnv\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:06 crc kubenswrapper[4837]: I1014 14:04:06.632999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:07 crc kubenswrapper[4837]: I1014 14:04:07.158331 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwrnv"] Oct 14 14:04:07 crc kubenswrapper[4837]: I1014 14:04:07.355271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerStarted","Data":"01a1793770d366468f89111c12a653ecd4fe35540a071ae1fb50c808ad95a439"} Oct 14 14:04:08 crc kubenswrapper[4837]: I1014 14:04:08.367265 4837 generic.go:334] "Generic (PLEG): container finished" podID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerID="81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c" exitCode=0 Oct 14 14:04:08 crc kubenswrapper[4837]: I1014 14:04:08.367315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerDied","Data":"81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c"} Oct 14 14:04:09 crc kubenswrapper[4837]: I1014 14:04:09.378750 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerStarted","Data":"a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71"} Oct 14 14:04:11 crc kubenswrapper[4837]: I1014 14:04:11.403946 4837 generic.go:334] "Generic (PLEG): container finished" podID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerID="a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71" exitCode=0 Oct 14 14:04:11 crc kubenswrapper[4837]: I1014 14:04:11.404483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" 
event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerDied","Data":"a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71"} Oct 14 14:04:12 crc kubenswrapper[4837]: I1014 14:04:12.415859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerStarted","Data":"cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a"} Oct 14 14:04:12 crc kubenswrapper[4837]: I1014 14:04:12.433421 4837 generic.go:334] "Generic (PLEG): container finished" podID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerID="983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2" exitCode=0 Oct 14 14:04:12 crc kubenswrapper[4837]: I1014 14:04:12.433479 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-96xbk/must-gather-hs27l" event={"ID":"19fc4828-1105-4f55-bfd4-9b8bc2b8403b","Type":"ContainerDied","Data":"983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2"} Oct 14 14:04:12 crc kubenswrapper[4837]: I1014 14:04:12.434627 4837 scope.go:117] "RemoveContainer" containerID="983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2" Oct 14 14:04:12 crc kubenswrapper[4837]: I1014 14:04:12.453839 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nwrnv" podStartSLOduration=2.8690617769999998 podStartE2EDuration="6.453821828s" podCreationTimestamp="2025-10-14 14:04:06 +0000 UTC" firstStartedPulling="2025-10-14 14:04:08.370009745 +0000 UTC m=+3786.287009568" lastFinishedPulling="2025-10-14 14:04:11.954769806 +0000 UTC m=+3789.871769619" observedRunningTime="2025-10-14 14:04:12.450694453 +0000 UTC m=+3790.367694286" watchObservedRunningTime="2025-10-14 14:04:12.453821828 +0000 UTC m=+3790.370821641" Oct 14 14:04:12 crc kubenswrapper[4837]: I1014 14:04:12.579700 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-96xbk_must-gather-hs27l_19fc4828-1105-4f55-bfd4-9b8bc2b8403b/gather/0.log" Oct 14 14:04:16 crc kubenswrapper[4837]: I1014 14:04:16.633965 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:16 crc kubenswrapper[4837]: I1014 14:04:16.634679 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:16 crc kubenswrapper[4837]: I1014 14:04:16.681744 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:17 crc kubenswrapper[4837]: I1014 14:04:17.548032 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:17 crc kubenswrapper[4837]: I1014 14:04:17.604751 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwrnv"] Oct 14 14:04:19 crc kubenswrapper[4837]: I1014 14:04:19.490423 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nwrnv" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="registry-server" containerID="cri-o://cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a" gracePeriod=2 Oct 14 14:04:19 crc kubenswrapper[4837]: I1014 14:04:19.785343 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:04:19 crc kubenswrapper[4837]: E1014 14:04:19.785870 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:04:19 crc kubenswrapper[4837]: I1014 14:04:19.973193 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.085091 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtkhw\" (UniqueName: \"kubernetes.io/projected/3d56e6b8-4127-4b29-aedf-474163cacd81-kube-api-access-gtkhw\") pod \"3d56e6b8-4127-4b29-aedf-474163cacd81\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.085258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-catalog-content\") pod \"3d56e6b8-4127-4b29-aedf-474163cacd81\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.085297 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-utilities\") pod \"3d56e6b8-4127-4b29-aedf-474163cacd81\" (UID: \"3d56e6b8-4127-4b29-aedf-474163cacd81\") " Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.086448 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-utilities" (OuterVolumeSpecName: "utilities") pod "3d56e6b8-4127-4b29-aedf-474163cacd81" (UID: "3d56e6b8-4127-4b29-aedf-474163cacd81"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.111431 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d56e6b8-4127-4b29-aedf-474163cacd81-kube-api-access-gtkhw" (OuterVolumeSpecName: "kube-api-access-gtkhw") pod "3d56e6b8-4127-4b29-aedf-474163cacd81" (UID: "3d56e6b8-4127-4b29-aedf-474163cacd81"). InnerVolumeSpecName "kube-api-access-gtkhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.121763 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-96xbk/must-gather-hs27l"] Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.121970 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-96xbk/must-gather-hs27l" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="copy" containerID="cri-o://0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1" gracePeriod=2 Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.152170 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-96xbk/must-gather-hs27l"] Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.187262 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtkhw\" (UniqueName: \"kubernetes.io/projected/3d56e6b8-4127-4b29-aedf-474163cacd81-kube-api-access-gtkhw\") on node \"crc\" DevicePath \"\"" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.187299 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.222438 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-catalog-content" (OuterVolumeSpecName: 
"catalog-content") pod "3d56e6b8-4127-4b29-aedf-474163cacd81" (UID: "3d56e6b8-4127-4b29-aedf-474163cacd81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.288921 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d56e6b8-4127-4b29-aedf-474163cacd81-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.448956 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-96xbk_must-gather-hs27l_19fc4828-1105-4f55-bfd4-9b8bc2b8403b/copy/0.log" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.449574 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.500486 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-96xbk_must-gather-hs27l_19fc4828-1105-4f55-bfd4-9b8bc2b8403b/copy/0.log" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.500822 4837 generic.go:334] "Generic (PLEG): container finished" podID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerID="0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1" exitCode=143 Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.500902 4837 scope.go:117] "RemoveContainer" containerID="0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.501043 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-96xbk/must-gather-hs27l" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.507705 4837 generic.go:334] "Generic (PLEG): container finished" podID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerID="cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a" exitCode=0 Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.507784 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwrnv" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.507811 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerDied","Data":"cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a"} Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.507855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwrnv" event={"ID":"3d56e6b8-4127-4b29-aedf-474163cacd81","Type":"ContainerDied","Data":"01a1793770d366468f89111c12a653ecd4fe35540a071ae1fb50c808ad95a439"} Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.523020 4837 scope.go:117] "RemoveContainer" containerID="983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.542194 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwrnv"] Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.550474 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nwrnv"] Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.566476 4837 scope.go:117] "RemoveContainer" containerID="0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1" Oct 14 14:04:20 crc kubenswrapper[4837]: E1014 14:04:20.566876 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1\": container with ID starting with 0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1 not found: ID does not exist" containerID="0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.566906 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1"} err="failed to get container status \"0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1\": rpc error: code = NotFound desc = could not find container \"0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1\": container with ID starting with 0050d73e3dc498a2fa86f4c851c8b4fb41da8b83d8ef8b169cecb7e9ce085ed1 not found: ID does not exist" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.566933 4837 scope.go:117] "RemoveContainer" containerID="983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2" Oct 14 14:04:20 crc kubenswrapper[4837]: E1014 14:04:20.567218 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2\": container with ID starting with 983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2 not found: ID does not exist" containerID="983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.567282 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2"} err="failed to get container status \"983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2\": rpc error: code = NotFound desc = could not find container 
\"983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2\": container with ID starting with 983dea8883a7ba8dea1bc451f1f4177c909fbc6766f5ecdd150ff5f9c0149db2 not found: ID does not exist" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.567308 4837 scope.go:117] "RemoveContainer" containerID="cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.590751 4837 scope.go:117] "RemoveContainer" containerID="a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.593041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-must-gather-output\") pod \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.593364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz8w7\" (UniqueName: \"kubernetes.io/projected/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-kube-api-access-dz8w7\") pod \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\" (UID: \"19fc4828-1105-4f55-bfd4-9b8bc2b8403b\") " Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.608398 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-kube-api-access-dz8w7" (OuterVolumeSpecName: "kube-api-access-dz8w7") pod "19fc4828-1105-4f55-bfd4-9b8bc2b8403b" (UID: "19fc4828-1105-4f55-bfd4-9b8bc2b8403b"). InnerVolumeSpecName "kube-api-access-dz8w7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.695689 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz8w7\" (UniqueName: \"kubernetes.io/projected/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-kube-api-access-dz8w7\") on node \"crc\" DevicePath \"\"" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.712251 4837 scope.go:117] "RemoveContainer" containerID="81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.739787 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "19fc4828-1105-4f55-bfd4-9b8bc2b8403b" (UID: "19fc4828-1105-4f55-bfd4-9b8bc2b8403b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.759133 4837 scope.go:117] "RemoveContainer" containerID="cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a" Oct 14 14:04:20 crc kubenswrapper[4837]: E1014 14:04:20.759736 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a\": container with ID starting with cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a not found: ID does not exist" containerID="cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.759766 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a"} err="failed to get container status \"cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a\": rpc error: code = NotFound desc = could not 
find container \"cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a\": container with ID starting with cf06904dec45348266ceaccf179cfa1808cea08327a7aeaa51172cb88180099a not found: ID does not exist" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.759785 4837 scope.go:117] "RemoveContainer" containerID="a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71" Oct 14 14:04:20 crc kubenswrapper[4837]: E1014 14:04:20.760093 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71\": container with ID starting with a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71 not found: ID does not exist" containerID="a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.760228 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71"} err="failed to get container status \"a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71\": rpc error: code = NotFound desc = could not find container \"a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71\": container with ID starting with a9c29e395677af80e51a5b4485c2956e472eeb6ed86d22cec11a80681bfd7a71 not found: ID does not exist" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.760245 4837 scope.go:117] "RemoveContainer" containerID="81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c" Oct 14 14:04:20 crc kubenswrapper[4837]: E1014 14:04:20.764631 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c\": container with ID starting with 81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c not found: ID 
does not exist" containerID="81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.764847 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c"} err="failed to get container status \"81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c\": rpc error: code = NotFound desc = could not find container \"81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c\": container with ID starting with 81f1440af3d01755da6b29c2896a768685c3e74f1a5b628b282fe2dae7a86d2c not found: ID does not exist" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.798400 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/19fc4828-1105-4f55-bfd4-9b8bc2b8403b-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.803986 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" path="/var/lib/kubelet/pods/19fc4828-1105-4f55-bfd4-9b8bc2b8403b/volumes" Oct 14 14:04:20 crc kubenswrapper[4837]: I1014 14:04:20.804603 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" path="/var/lib/kubelet/pods/3d56e6b8-4127-4b29-aedf-474163cacd81/volumes" Oct 14 14:04:31 crc kubenswrapper[4837]: I1014 14:04:31.784360 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:04:31 crc kubenswrapper[4837]: E1014 14:04:31.785077 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:04:43 crc kubenswrapper[4837]: I1014 14:04:43.785303 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:04:43 crc kubenswrapper[4837]: E1014 14:04:43.786310 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:04:55 crc kubenswrapper[4837]: I1014 14:04:55.785367 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:04:55 crc kubenswrapper[4837]: E1014 14:04:55.786494 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:05:07 crc kubenswrapper[4837]: I1014 14:05:07.785143 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:05:07 crc kubenswrapper[4837]: E1014 14:05:07.785970 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.610438 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9bw42/must-gather-s6f72"] Oct 14 14:05:08 crc kubenswrapper[4837]: E1014 14:05:08.611048 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="copy" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611060 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="copy" Oct 14 14:05:08 crc kubenswrapper[4837]: E1014 14:05:08.611076 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="extract-content" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611085 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="extract-content" Oct 14 14:05:08 crc kubenswrapper[4837]: E1014 14:05:08.611099 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="gather" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611104 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="gather" Oct 14 14:05:08 crc kubenswrapper[4837]: E1014 14:05:08.611126 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="registry-server" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611132 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="registry-server" Oct 14 14:05:08 crc 
kubenswrapper[4837]: E1014 14:05:08.611143 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="extract-utilities" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611150 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="extract-utilities" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611350 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d56e6b8-4127-4b29-aedf-474163cacd81" containerName="registry-server" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611366 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="copy" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.611382 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fc4828-1105-4f55-bfd4-9b8bc2b8403b" containerName="gather" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.612292 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.614364 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9bw42"/"openshift-service-ca.crt" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.619950 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9bw42"/"kube-root-ca.crt" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.639398 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9bw42/must-gather-s6f72"] Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.694786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgwx\" (UniqueName: \"kubernetes.io/projected/a7de80d5-614f-47a2-a149-c036f5e5c1c8-kube-api-access-jrgwx\") pod \"must-gather-s6f72\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.694928 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7de80d5-614f-47a2-a149-c036f5e5c1c8-must-gather-output\") pod \"must-gather-s6f72\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.797961 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgwx\" (UniqueName: \"kubernetes.io/projected/a7de80d5-614f-47a2-a149-c036f5e5c1c8-kube-api-access-jrgwx\") pod \"must-gather-s6f72\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.798089 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7de80d5-614f-47a2-a149-c036f5e5c1c8-must-gather-output\") pod \"must-gather-s6f72\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.798839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7de80d5-614f-47a2-a149-c036f5e5c1c8-must-gather-output\") pod \"must-gather-s6f72\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.817822 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgwx\" (UniqueName: \"kubernetes.io/projected/a7de80d5-614f-47a2-a149-c036f5e5c1c8-kube-api-access-jrgwx\") pod \"must-gather-s6f72\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:08 crc kubenswrapper[4837]: I1014 14:05:08.931107 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:05:09 crc kubenswrapper[4837]: I1014 14:05:09.567849 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9bw42/must-gather-s6f72"] Oct 14 14:05:10 crc kubenswrapper[4837]: I1014 14:05:10.007741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/must-gather-s6f72" event={"ID":"a7de80d5-614f-47a2-a149-c036f5e5c1c8","Type":"ContainerStarted","Data":"ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71"} Oct 14 14:05:10 crc kubenswrapper[4837]: I1014 14:05:10.009215 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/must-gather-s6f72" event={"ID":"a7de80d5-614f-47a2-a149-c036f5e5c1c8","Type":"ContainerStarted","Data":"ec4a74bf8ae3d6ea66e9c3909f31cea5c73a5459c6173d03016c47f4b55783ea"} Oct 14 14:05:11 crc kubenswrapper[4837]: I1014 14:05:11.017120 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/must-gather-s6f72" event={"ID":"a7de80d5-614f-47a2-a149-c036f5e5c1c8","Type":"ContainerStarted","Data":"ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb"} Oct 14 14:05:11 crc kubenswrapper[4837]: I1014 14:05:11.035027 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9bw42/must-gather-s6f72" podStartSLOduration=3.035004214 podStartE2EDuration="3.035004214s" podCreationTimestamp="2025-10-14 14:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 14:05:11.034592593 +0000 UTC m=+3848.951592406" watchObservedRunningTime="2025-10-14 14:05:11.035004214 +0000 UTC m=+3848.952004047" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.430027 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9bw42/crc-debug-wbvfz"] Oct 14 14:05:13 crc kubenswrapper[4837]: 
I1014 14:05:13.431587 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.433484 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9bw42"/"default-dockercfg-87mgl" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.494304 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-host\") pod \"crc-debug-wbvfz\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.494428 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbnn\" (UniqueName: \"kubernetes.io/projected/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-kube-api-access-8mbnn\") pod \"crc-debug-wbvfz\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.596301 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-host\") pod \"crc-debug-wbvfz\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.596445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-host\") pod \"crc-debug-wbvfz\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.596800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8mbnn\" (UniqueName: \"kubernetes.io/projected/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-kube-api-access-8mbnn\") pod \"crc-debug-wbvfz\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.623771 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbnn\" (UniqueName: \"kubernetes.io/projected/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-kube-api-access-8mbnn\") pod \"crc-debug-wbvfz\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:13 crc kubenswrapper[4837]: I1014 14:05:13.749088 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:14 crc kubenswrapper[4837]: I1014 14:05:14.058552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" event={"ID":"16f3e93e-ac44-470e-9d12-fe5b17d4efe7","Type":"ContainerStarted","Data":"c849382ecbeb5834d3de307d9b68d454afd018f07fc401f2a2009d431b9c38bf"} Oct 14 14:05:15 crc kubenswrapper[4837]: I1014 14:05:15.068953 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" event={"ID":"16f3e93e-ac44-470e-9d12-fe5b17d4efe7","Type":"ContainerStarted","Data":"5b4d6b36041484f4e49a94cb9bf8ac1bc9b4ea5366905b3608d19fff399afc2d"} Oct 14 14:05:15 crc kubenswrapper[4837]: I1014 14:05:15.088480 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" podStartSLOduration=2.088462204 podStartE2EDuration="2.088462204s" podCreationTimestamp="2025-10-14 14:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 14:05:15.082070971 +0000 UTC m=+3852.999070784" watchObservedRunningTime="2025-10-14 
14:05:15.088462204 +0000 UTC m=+3853.005462017" Oct 14 14:05:22 crc kubenswrapper[4837]: I1014 14:05:22.796072 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:05:22 crc kubenswrapper[4837]: E1014 14:05:22.796964 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:05:35 crc kubenswrapper[4837]: I1014 14:05:35.784138 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:05:35 crc kubenswrapper[4837]: E1014 14:05:35.784868 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h4ggd_openshift-machine-config-operator(d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" Oct 14 14:05:47 crc kubenswrapper[4837]: I1014 14:05:47.378577 4837 generic.go:334] "Generic (PLEG): container finished" podID="16f3e93e-ac44-470e-9d12-fe5b17d4efe7" containerID="5b4d6b36041484f4e49a94cb9bf8ac1bc9b4ea5366905b3608d19fff399afc2d" exitCode=0 Oct 14 14:05:47 crc kubenswrapper[4837]: I1014 14:05:47.378677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" event={"ID":"16f3e93e-ac44-470e-9d12-fe5b17d4efe7","Type":"ContainerDied","Data":"5b4d6b36041484f4e49a94cb9bf8ac1bc9b4ea5366905b3608d19fff399afc2d"} Oct 14 14:05:47 crc 
kubenswrapper[4837]: I1014 14:05:47.784561 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.390040 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"7d41c0ab4b0c247f216128b509a252b01626127621aa5db49ecb9deb60bb0056"} Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.513834 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.549282 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9bw42/crc-debug-wbvfz"] Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.563805 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9bw42/crc-debug-wbvfz"] Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.682098 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mbnn\" (UniqueName: \"kubernetes.io/projected/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-kube-api-access-8mbnn\") pod \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.682558 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-host\") pod \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\" (UID: \"16f3e93e-ac44-470e-9d12-fe5b17d4efe7\") " Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.682847 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-host" (OuterVolumeSpecName: "host") pod 
"16f3e93e-ac44-470e-9d12-fe5b17d4efe7" (UID: "16f3e93e-ac44-470e-9d12-fe5b17d4efe7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.683356 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-host\") on node \"crc\" DevicePath \"\"" Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.689306 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-kube-api-access-8mbnn" (OuterVolumeSpecName: "kube-api-access-8mbnn") pod "16f3e93e-ac44-470e-9d12-fe5b17d4efe7" (UID: "16f3e93e-ac44-470e-9d12-fe5b17d4efe7"). InnerVolumeSpecName "kube-api-access-8mbnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.791199 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mbnn\" (UniqueName: \"kubernetes.io/projected/16f3e93e-ac44-470e-9d12-fe5b17d4efe7-kube-api-access-8mbnn\") on node \"crc\" DevicePath \"\"" Oct 14 14:05:48 crc kubenswrapper[4837]: I1014 14:05:48.801014 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f3e93e-ac44-470e-9d12-fe5b17d4efe7" path="/var/lib/kubelet/pods/16f3e93e-ac44-470e-9d12-fe5b17d4efe7/volumes" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.428145 4837 scope.go:117] "RemoveContainer" containerID="5b4d6b36041484f4e49a94cb9bf8ac1bc9b4ea5366905b3608d19fff399afc2d" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.428399 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-wbvfz" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.693438 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9bw42/crc-debug-9sjg4"] Oct 14 14:05:49 crc kubenswrapper[4837]: E1014 14:05:49.693868 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f3e93e-ac44-470e-9d12-fe5b17d4efe7" containerName="container-00" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.693884 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f3e93e-ac44-470e-9d12-fe5b17d4efe7" containerName="container-00" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.694133 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f3e93e-ac44-470e-9d12-fe5b17d4efe7" containerName="container-00" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.694788 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.697609 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9bw42"/"default-dockercfg-87mgl" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.722405 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfv4h\" (UniqueName: \"kubernetes.io/projected/10a886fb-df51-44e1-88b9-404089da7a25-kube-api-access-pfv4h\") pod \"crc-debug-9sjg4\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.722453 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a886fb-df51-44e1-88b9-404089da7a25-host\") pod \"crc-debug-9sjg4\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " 
pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.824177 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfv4h\" (UniqueName: \"kubernetes.io/projected/10a886fb-df51-44e1-88b9-404089da7a25-kube-api-access-pfv4h\") pod \"crc-debug-9sjg4\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.824244 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a886fb-df51-44e1-88b9-404089da7a25-host\") pod \"crc-debug-9sjg4\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.824368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a886fb-df51-44e1-88b9-404089da7a25-host\") pod \"crc-debug-9sjg4\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:49 crc kubenswrapper[4837]: I1014 14:05:49.842896 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfv4h\" (UniqueName: \"kubernetes.io/projected/10a886fb-df51-44e1-88b9-404089da7a25-kube-api-access-pfv4h\") pod \"crc-debug-9sjg4\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:50 crc kubenswrapper[4837]: I1014 14:05:50.010969 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:50 crc kubenswrapper[4837]: I1014 14:05:50.446137 4837 generic.go:334] "Generic (PLEG): container finished" podID="10a886fb-df51-44e1-88b9-404089da7a25" containerID="166d61e946e27c3ae9b6dfa5eb662848460ca323ab360763d631583ae186118d" exitCode=0 Oct 14 14:05:50 crc kubenswrapper[4837]: I1014 14:05:50.446229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-9sjg4" event={"ID":"10a886fb-df51-44e1-88b9-404089da7a25","Type":"ContainerDied","Data":"166d61e946e27c3ae9b6dfa5eb662848460ca323ab360763d631583ae186118d"} Oct 14 14:05:50 crc kubenswrapper[4837]: I1014 14:05:50.446471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-9sjg4" event={"ID":"10a886fb-df51-44e1-88b9-404089da7a25","Type":"ContainerStarted","Data":"1f58e57969c371c51884c7ed95e0db93be76d5a30b2eb9c917783766fc830a7a"} Oct 14 14:05:50 crc kubenswrapper[4837]: I1014 14:05:50.881218 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9bw42/crc-debug-9sjg4"] Oct 14 14:05:50 crc kubenswrapper[4837]: I1014 14:05:50.888306 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9bw42/crc-debug-9sjg4"] Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.574654 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.757462 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfv4h\" (UniqueName: \"kubernetes.io/projected/10a886fb-df51-44e1-88b9-404089da7a25-kube-api-access-pfv4h\") pod \"10a886fb-df51-44e1-88b9-404089da7a25\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.757691 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a886fb-df51-44e1-88b9-404089da7a25-host\") pod \"10a886fb-df51-44e1-88b9-404089da7a25\" (UID: \"10a886fb-df51-44e1-88b9-404089da7a25\") " Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.757742 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10a886fb-df51-44e1-88b9-404089da7a25-host" (OuterVolumeSpecName: "host") pod "10a886fb-df51-44e1-88b9-404089da7a25" (UID: "10a886fb-df51-44e1-88b9-404089da7a25"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.758093 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a886fb-df51-44e1-88b9-404089da7a25-host\") on node \"crc\" DevicePath \"\"" Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.763428 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a886fb-df51-44e1-88b9-404089da7a25-kube-api-access-pfv4h" (OuterVolumeSpecName: "kube-api-access-pfv4h") pod "10a886fb-df51-44e1-88b9-404089da7a25" (UID: "10a886fb-df51-44e1-88b9-404089da7a25"). InnerVolumeSpecName "kube-api-access-pfv4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:05:51 crc kubenswrapper[4837]: I1014 14:05:51.859926 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfv4h\" (UniqueName: \"kubernetes.io/projected/10a886fb-df51-44e1-88b9-404089da7a25-kube-api-access-pfv4h\") on node \"crc\" DevicePath \"\"" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.041536 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9bw42/crc-debug-rdm97"] Oct 14 14:05:52 crc kubenswrapper[4837]: E1014 14:05:52.041898 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a886fb-df51-44e1-88b9-404089da7a25" containerName="container-00" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.041915 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a886fb-df51-44e1-88b9-404089da7a25" containerName="container-00" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.042119 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a886fb-df51-44e1-88b9-404089da7a25" containerName="container-00" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.042715 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.171555 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5c530a8-0a48-48f9-8407-5916493fc82f-host\") pod \"crc-debug-rdm97\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.171685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/d5c530a8-0a48-48f9-8407-5916493fc82f-kube-api-access-q6g7n\") pod \"crc-debug-rdm97\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.273024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5c530a8-0a48-48f9-8407-5916493fc82f-host\") pod \"crc-debug-rdm97\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.273174 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5c530a8-0a48-48f9-8407-5916493fc82f-host\") pod \"crc-debug-rdm97\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.273252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/d5c530a8-0a48-48f9-8407-5916493fc82f-kube-api-access-q6g7n\") pod \"crc-debug-rdm97\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc 
kubenswrapper[4837]: I1014 14:05:52.291574 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/d5c530a8-0a48-48f9-8407-5916493fc82f-kube-api-access-q6g7n\") pod \"crc-debug-rdm97\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.359853 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:52 crc kubenswrapper[4837]: W1014 14:05:52.388888 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5c530a8_0a48_48f9_8407_5916493fc82f.slice/crio-b4966937c6b7cca69a0932a9ca70205272eef8f6293f1c3b92bcbb8f4be12a49 WatchSource:0}: Error finding container b4966937c6b7cca69a0932a9ca70205272eef8f6293f1c3b92bcbb8f4be12a49: Status 404 returned error can't find the container with id b4966937c6b7cca69a0932a9ca70205272eef8f6293f1c3b92bcbb8f4be12a49 Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.464641 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-rdm97" event={"ID":"d5c530a8-0a48-48f9-8407-5916493fc82f","Type":"ContainerStarted","Data":"b4966937c6b7cca69a0932a9ca70205272eef8f6293f1c3b92bcbb8f4be12a49"} Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.466106 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f58e57969c371c51884c7ed95e0db93be76d5a30b2eb9c917783766fc830a7a" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.466144 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-9sjg4" Oct 14 14:05:52 crc kubenswrapper[4837]: I1014 14:05:52.799137 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a886fb-df51-44e1-88b9-404089da7a25" path="/var/lib/kubelet/pods/10a886fb-df51-44e1-88b9-404089da7a25/volumes" Oct 14 14:05:53 crc kubenswrapper[4837]: I1014 14:05:53.475878 4837 generic.go:334] "Generic (PLEG): container finished" podID="d5c530a8-0a48-48f9-8407-5916493fc82f" containerID="4f2d81823dc10524fcc3b4f2782056a65ad4b4ccfb4b5efbd74d1ec8379f3ed0" exitCode=0 Oct 14 14:05:53 crc kubenswrapper[4837]: I1014 14:05:53.475916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/crc-debug-rdm97" event={"ID":"d5c530a8-0a48-48f9-8407-5916493fc82f","Type":"ContainerDied","Data":"4f2d81823dc10524fcc3b4f2782056a65ad4b4ccfb4b5efbd74d1ec8379f3ed0"} Oct 14 14:05:53 crc kubenswrapper[4837]: I1014 14:05:53.522437 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9bw42/crc-debug-rdm97"] Oct 14 14:05:53 crc kubenswrapper[4837]: I1014 14:05:53.531172 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9bw42/crc-debug-rdm97"] Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.605468 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.719964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5c530a8-0a48-48f9-8407-5916493fc82f-host\") pod \"d5c530a8-0a48-48f9-8407-5916493fc82f\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.720065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/d5c530a8-0a48-48f9-8407-5916493fc82f-kube-api-access-q6g7n\") pod \"d5c530a8-0a48-48f9-8407-5916493fc82f\" (UID: \"d5c530a8-0a48-48f9-8407-5916493fc82f\") " Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.721059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c530a8-0a48-48f9-8407-5916493fc82f-host" (OuterVolumeSpecName: "host") pod "d5c530a8-0a48-48f9-8407-5916493fc82f" (UID: "d5c530a8-0a48-48f9-8407-5916493fc82f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.727047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c530a8-0a48-48f9-8407-5916493fc82f-kube-api-access-q6g7n" (OuterVolumeSpecName: "kube-api-access-q6g7n") pod "d5c530a8-0a48-48f9-8407-5916493fc82f" (UID: "d5c530a8-0a48-48f9-8407-5916493fc82f"). InnerVolumeSpecName "kube-api-access-q6g7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.800344 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c530a8-0a48-48f9-8407-5916493fc82f" path="/var/lib/kubelet/pods/d5c530a8-0a48-48f9-8407-5916493fc82f/volumes" Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.822134 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/d5c530a8-0a48-48f9-8407-5916493fc82f-kube-api-access-q6g7n\") on node \"crc\" DevicePath \"\"" Oct 14 14:05:54 crc kubenswrapper[4837]: I1014 14:05:54.822201 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5c530a8-0a48-48f9-8407-5916493fc82f-host\") on node \"crc\" DevicePath \"\"" Oct 14 14:05:55 crc kubenswrapper[4837]: I1014 14:05:55.500087 4837 scope.go:117] "RemoveContainer" containerID="4f2d81823dc10524fcc3b4f2782056a65ad4b4ccfb4b5efbd74d1ec8379f3ed0" Oct 14 14:05:55 crc kubenswrapper[4837]: I1014 14:05:55.500136 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/crc-debug-rdm97" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.137729 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5799b74b9d-p594h_7cb7fa99-fe9e-4e56-a3ef-26c6ad271530/barbican-api/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.278351 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5799b74b9d-p594h_7cb7fa99-fe9e-4e56-a3ef-26c6ad271530/barbican-api-log/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.346499 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5dd7b6957d-hqts4_e52b1001-3fb0-415b-be6a-e55a548462ac/barbican-keystone-listener/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.376412 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5dd7b6957d-hqts4_e52b1001-3fb0-415b-be6a-e55a548462ac/barbican-keystone-listener-log/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.476992 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-64dbf95879-s4jqv_d42a5890-7561-4b99-9518-0c6c672217d9/barbican-worker/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.528429 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-64dbf95879-s4jqv_d42a5890-7561-4b99-9518-0c6c672217d9/barbican-worker-log/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.706290 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xdd9b_6217fcbf-8651-4d63-b670-71de72f5feed/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.750225 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/ceilometer-central-agent/0.log" 
Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.906204 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/proxy-httpd/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.906310 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/ceilometer-notification-agent/0.log" Oct 14 14:06:09 crc kubenswrapper[4837]: I1014 14:06:09.975317 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79be4f00-8769-4d3c-aa9c-a1bb24787668/sg-core/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.093130 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_92e412ce-d61d-4c7f-8297-ce2cc5011325/cinder-api-log/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.138100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_92e412ce-d61d-4c7f-8297-ce2cc5011325/cinder-api/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.269369 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_aa0d5e89-66d3-4f22-9704-7c3c35ee537f/cinder-scheduler/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.336703 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_aa0d5e89-66d3-4f22-9704-7c3c35ee537f/probe/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.422695 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-t4vzp_b523b3d5-ba31-4620-8287-055d6bc931cc/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.511720 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b7zcd_b030d75a-71e0-41af-9ab0-298924d1a955/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.637444 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m9npl_1febddb2-b222-433e-b8bc-47a3956bc38d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.750633 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-c696x_4a6f65cd-fd19-4b6a-9dee-4ef117beb86f/init/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.919604 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-c696x_4a6f65cd-fd19-4b6a-9dee-4ef117beb86f/init/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.972010 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-shsmx_714cea27-46ab-4d03-b5d1-81b42d99f6f6/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:10 crc kubenswrapper[4837]: I1014 14:06:10.996294 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-c696x_4a6f65cd-fd19-4b6a-9dee-4ef117beb86f/dnsmasq-dns/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.131633 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a6304802-caa4-4ed2-a570-fc09f7c940b5/glance-log/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.161738 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a6304802-caa4-4ed2-a570-fc09f7c940b5/glance-httpd/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.306573 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_d215f9ee-cdfe-47a1-8240-e74f9f81f97d/glance-log/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.340031 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d215f9ee-cdfe-47a1-8240-e74f9f81f97d/glance-httpd/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.484401 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b48ff9644-mb62f_0d3a61c6-2a73-409f-b296-10f7a19685d6/horizon/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.614621 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jtcgd_97b10dd3-253f-47fa-ad50-4765f7139f4f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.799787 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hr9tc_12398715-a536-446f-81aa-00aa7b0546ed/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:11 crc kubenswrapper[4837]: I1014 14:06:11.867482 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b48ff9644-mb62f_0d3a61c6-2a73-409f-b296-10f7a19685d6/horizon-log/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.067362 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7bcd589b8f-ljfsq_9dd1fb1b-4520-43c9-8a24-fd0a225856a3/keystone-api/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.091593 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29340841-2cmzh_9e98c493-7d92-4165-918b-44dace3ca02a/keystone-cron/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.223706 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_b20b7196-1920-4b12-a38e-7356ca4dc4e2/kube-state-metrics/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.292627 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-78vdf_aacf282f-f2c7-447d-9e73-98a35898f8df/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.623416 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f7c5db7df-7tsqg_a7d3bc97-ce39-472b-860e-79b620b726f1/neutron-api/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.710101 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f7c5db7df-7tsqg_a7d3bc97-ce39-472b-860e-79b620b726f1/neutron-httpd/0.log" Oct 14 14:06:12 crc kubenswrapper[4837]: I1014 14:06:12.893023 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-78q9h_e0186e2a-7938-4646-ba9c-768d75c09605/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:13 crc kubenswrapper[4837]: I1014 14:06:13.393664 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9a329d7-874c-4b64-b23e-10463d345068/nova-api-log/0.log" Oct 14 14:06:13 crc kubenswrapper[4837]: I1014 14:06:13.430674 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bd40b584-fe33-48fa-a09b-e50f7b40f785/nova-cell0-conductor-conductor/0.log" Oct 14 14:06:13 crc kubenswrapper[4837]: I1014 14:06:13.749958 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c2118184-3d60-4d7a-b203-961341c9be78/nova-cell1-conductor-conductor/0.log" Oct 14 14:06:13 crc kubenswrapper[4837]: I1014 14:06:13.813987 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_abb35cf2-796d-40bc-8b6b-d421dec44645/nova-cell1-novncproxy-novncproxy/0.log" Oct 14 14:06:13 crc kubenswrapper[4837]: I1014 14:06:13.865838 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9a329d7-874c-4b64-b23e-10463d345068/nova-api-api/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.030976 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9v67f_51ebf601-fdd4-46d5-b68e-97846a7baff5/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.142351 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2db56d67-c528-47cc-8569-d9636ebd2667/nova-metadata-log/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.492444 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11ead61d-f315-4ee0-9dcb-a222012a9c36/mysql-bootstrap/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.537213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ebe0b001-1902-4166-a8a3-b3d0c54139f4/nova-scheduler-scheduler/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.736918 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11ead61d-f315-4ee0-9dcb-a222012a9c36/galera/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.760310 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_11ead61d-f315-4ee0-9dcb-a222012a9c36/mysql-bootstrap/0.log" Oct 14 14:06:14 crc kubenswrapper[4837]: I1014 14:06:14.956057 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_162a8777-0979-4087-959a-98cd20678758/mysql-bootstrap/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.102016 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_162a8777-0979-4087-959a-98cd20678758/galera/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.161210 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_162a8777-0979-4087-959a-98cd20678758/mysql-bootstrap/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.300950 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_75fffdca-61c2-4af0-a87d-1662358aa171/openstackclient/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.425792 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-j4tpc_14f970e0-8d42-46d6-937a-c39f521f6bea/ovn-controller/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.505720 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2db56d67-c528-47cc-8569-d9636ebd2667/nova-metadata-metadata/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.622093 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vj9cc_43da7026-edd3-4f7f-9944-1aff537446a0/openstack-network-exporter/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.693817 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovsdb-server-init/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.885484 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovsdb-server-init/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.955782 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovs-vswitchd/0.log" Oct 14 14:06:15 crc kubenswrapper[4837]: I1014 14:06:15.967714 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-cp8xg_802ad4c4-c2e5-4c95-88ef-950d8f1fbdf6/ovsdb-server/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.134739 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hc8z6_6f5e2181-b922-48d6-909c-ad1f87fee631/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.267332 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f648709-678d-4844-8571-ac5c5c5712a3/ovn-northd/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.279378 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f648709-678d-4844-8571-ac5c5c5712a3/openstack-network-exporter/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.449621 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cc544b19-4b52-46ca-9c0b-518f78ebb47b/openstack-network-exporter/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.547916 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cc544b19-4b52-46ca-9c0b-518f78ebb47b/ovsdbserver-nb/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.640667 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7f5a204e-7b4c-41c2-8d69-e93d3c986249/openstack-network-exporter/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.644254 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7f5a204e-7b4c-41c2-8d69-e93d3c986249/ovsdbserver-sb/0.log" Oct 14 14:06:16 crc kubenswrapper[4837]: I1014 14:06:16.825181 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67c99f9644-lpk76_3be94ea9-34d4-4765-92cf-93345cfb88bb/placement-api/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.015878 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7edbbd-c98f-4800-a4ae-49ea0de7f12d/setup-container/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.048244 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67c99f9644-lpk76_3be94ea9-34d4-4765-92cf-93345cfb88bb/placement-log/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.423761 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7edbbd-c98f-4800-a4ae-49ea0de7f12d/rabbitmq/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.479940 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7edbbd-c98f-4800-a4ae-49ea0de7f12d/setup-container/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.503606 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4e21425-fc2a-487e-bb81-615828fd727f/setup-container/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.672545 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4e21425-fc2a-487e-bb81-615828fd727f/rabbitmq/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.742450 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c4e21425-fc2a-487e-bb81-615828fd727f/setup-container/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.768973 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mrkcj_611b04f3-d9fa-4841-8cd5-608c99279890/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:17 crc kubenswrapper[4837]: I1014 14:06:17.999741 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ggmtq_d5e9e4de-2bda-45cb-a580-b89e8dee024e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:18 crc 
kubenswrapper[4837]: I1014 14:06:18.021623 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-88jtv_8fbd0386-a3fe-4ad1-8b44-0945dd47a255/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:18 crc kubenswrapper[4837]: I1014 14:06:18.172781 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2nnnt_ddd3587a-7e10-4ad8-90bf-c172acc6e635/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:18 crc kubenswrapper[4837]: I1014 14:06:18.386527 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m8c7v_805056c6-9ce3-4dcf-852d-2a71b8627f80/ssh-known-hosts-edpm-deployment/0.log" Oct 14 14:06:18 crc kubenswrapper[4837]: I1014 14:06:18.723603 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-68b7b9db59-mdpgm_7ac8f443-1071-49d6-94d2-e7fea6f09cc5/proxy-server/0.log" Oct 14 14:06:18 crc kubenswrapper[4837]: I1014 14:06:18.763831 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-zww9d_1960b9c9-0169-447b-a184-21c3522760f8/swift-ring-rebalance/0.log" Oct 14 14:06:18 crc kubenswrapper[4837]: I1014 14:06:18.794893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-68b7b9db59-mdpgm_7ac8f443-1071-49d6-94d2-e7fea6f09cc5/proxy-httpd/0.log" Oct 14 14:06:18 crc kubenswrapper[4837]: I1014 14:06:18.929027 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-auditor/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.008525 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-reaper/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.026752 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-replicator/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.147157 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-auditor/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.153702 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/account-server/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.267174 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-server/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.269638 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-replicator/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.365295 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/container-updater/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.435870 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-auditor/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.476095 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-replicator/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.496661 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-expirer/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.577035 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-server/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.661269 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/swift-recon-cron/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.662716 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/object-updater/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.751398 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d34918e7-1e17-4d1d-a163-4d2f0539f2d7/rsync/0.log" Oct 14 14:06:19 crc kubenswrapper[4837]: I1014 14:06:19.918028 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5wbtn_aa5f8d90-d124-49cd-ac34-f24b91f0a457/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:20 crc kubenswrapper[4837]: I1014 14:06:20.067892 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_beac4f98-00d6-438b-86cc-2f85d2ca1f96/tempest-tests-tempest-tests-runner/0.log" Oct 14 14:06:20 crc kubenswrapper[4837]: I1014 14:06:20.111122 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5726ac57-50cd-4985-a6cd-86a9683bb283/test-operator-logs-container/0.log" Oct 14 14:06:20 crc kubenswrapper[4837]: I1014 14:06:20.299019 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-qxb82_6fa36834-4501-43b2-8084-2c79052f5185/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 14 14:06:30 crc kubenswrapper[4837]: I1014 14:06:30.790832 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_0454524b-c83b-4049-ad05-8b29a317bc91/memcached/0.log" Oct 14 14:06:42 crc kubenswrapper[4837]: I1014 14:06:42.925965 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/util/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.077360 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/util/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.097756 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/pull/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.142201 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/pull/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.249912 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/util/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.290220 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/pull/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.302217 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4a94fee3d89c1a1ab3bc183409878c9c9f43f00d9425b5e65d4d7ec9spmwk_6d53462f-684e-4f9b-91dc-c9b7e9edf8aa/extract/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.440738 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-49s6c_e1d5f52e-4c67-4242-bea3-6eef9fb72623/kube-rbac-proxy/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.512448 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-49s6c_e1d5f52e-4c67-4242-bea3-6eef9fb72623/manager/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.557019 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5f627_53370b8e-db35-4a50-af38-f24ac2fad459/kube-rbac-proxy/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.677141 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5f627_53370b8e-db35-4a50-af38-f24ac2fad459/manager/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.688325 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-jczn9_37e6419b-1647-43e2-89ef-67deae94e8b3/kube-rbac-proxy/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.745918 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-jczn9_37e6419b-1647-43e2-89ef-67deae94e8b3/manager/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.851786 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-q9dmc_f915ddfd-5160-4f57-85a8-9b5fe02c1908/kube-rbac-proxy/0.log" Oct 14 14:06:43 crc kubenswrapper[4837]: I1014 14:06:43.948715 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-q9dmc_f915ddfd-5160-4f57-85a8-9b5fe02c1908/manager/0.log" Oct 14 14:06:44 crc 
kubenswrapper[4837]: I1014 14:06:44.042007 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qkd6g_cfce54d9-39e9-4b1f-bb95-11d72de2cbdc/kube-rbac-proxy/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.042519 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qkd6g_cfce54d9-39e9-4b1f-bb95-11d72de2cbdc/manager/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.146634 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-9hctf_f7815a82-8a77-47a1-8a07-966eb6340b2b/kube-rbac-proxy/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.252546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-9hctf_f7815a82-8a77-47a1-8a07-966eb6340b2b/manager/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.332632 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-7xv4c_62e66325-7f63-4815-9f2d-fafbd138fa4e/kube-rbac-proxy/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.485371 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-7xv4c_62e66325-7f63-4815-9f2d-fafbd138fa4e/manager/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.533494 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-n6pcv_32cb1840-83d3-40ec-859a-15391e369bde/kube-rbac-proxy/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.593138 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-n6pcv_32cb1840-83d3-40ec-859a-15391e369bde/manager/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.707420 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-fq8p9_4f4fbd70-1ccf-4509-8552-ab902e8e7a0f/kube-rbac-proxy/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.741707 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-fq8p9_4f4fbd70-1ccf-4509-8552-ab902e8e7a0f/manager/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.849073 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-87zsz_62be7f3d-ddbe-4470-ace0-0907330b09ac/kube-rbac-proxy/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.890781 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-87zsz_62be7f3d-ddbe-4470-ace0-0907330b09ac/manager/0.log" Oct 14 14:06:44 crc kubenswrapper[4837]: I1014 14:06:44.987053 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-mdzbv_e4f5b829-46e0-4048-9b51-1a9256375d4f/kube-rbac-proxy/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.072419 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-mdzbv_e4f5b829-46e0-4048-9b51-1a9256375d4f/manager/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.128135 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-b2hkp_7f62a453-6fb4-4769-a2ef-da03024d8e90/kube-rbac-proxy/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 
14:06:45.248215 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-b2hkp_7f62a453-6fb4-4769-a2ef-da03024d8e90/manager/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.294930 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-hx2m5_4d12bc33-de6d-405c-b539-72ab956b4234/kube-rbac-proxy/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.382586 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-hx2m5_4d12bc33-de6d-405c-b539-72ab956b4234/manager/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.504787 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-6w7km_fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e/kube-rbac-proxy/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.508494 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-6w7km_fc3acb5d-8e6d-4c7a-9f6d-e59e87d6213e/manager/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.628859 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b_922d6301-937e-403a-ade6-06620798c61c/kube-rbac-proxy/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.645085 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dlj75b_922d6301-937e-403a-ade6-06620798c61c/manager/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.775123 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-84c49f8869-sxmsq_293d5905-c149-4fe1-a09d-204cc4cff4e6/kube-rbac-proxy/0.log" Oct 14 14:06:45 crc kubenswrapper[4837]: I1014 14:06:45.907663 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f9b497985-b8x95_6092738b-995d-48bd-a9a7-0c5b4caebea9/kube-rbac-proxy/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.116044 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6f9b497985-b8x95_6092738b-995d-48bd-a9a7-0c5b4caebea9/operator/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.141143 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qq6lf_aafb3bab-e32a-4523-8b72-b3131408a0be/registry-server/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.444561 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-mlhgx_1ac92ea3-d385-42f1-bc27-59a93f495cbc/manager/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.471791 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-mlhgx_1ac92ea3-d385-42f1-bc27-59a93f495cbc/kube-rbac-proxy/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.672735 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-jkjjd_a086b7d2-5401-4754-9825-2425a3a2aa22/kube-rbac-proxy/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.784860 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-jkjjd_a086b7d2-5401-4754-9825-2425a3a2aa22/manager/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.843921 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-jl7vs_be28404e-866e-4ffd-8cfc-a43090217244/operator/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.909324 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-84c49f8869-sxmsq_293d5905-c149-4fe1-a09d-204cc4cff4e6/manager/0.log" Oct 14 14:06:46 crc kubenswrapper[4837]: I1014 14:06:46.955046 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-t8t4r_640618dc-c509-410b-9669-9b77a1f8d068/kube-rbac-proxy/0.log" Oct 14 14:06:47 crc kubenswrapper[4837]: I1014 14:06:47.020386 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-t8t4r_640618dc-c509-410b-9669-9b77a1f8d068/manager/0.log" Oct 14 14:06:47 crc kubenswrapper[4837]: I1014 14:06:47.115821 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-78pkj_7d205182-3314-4282-800d-4dc57b64f416/kube-rbac-proxy/0.log" Oct 14 14:06:47 crc kubenswrapper[4837]: I1014 14:06:47.181431 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-78pkj_7d205182-3314-4282-800d-4dc57b64f416/manager/0.log" Oct 14 14:06:47 crc kubenswrapper[4837]: I1014 14:06:47.221896 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mhs2q_c2182e6f-c24c-4164-a269-4c11d34057a7/kube-rbac-proxy/0.log" Oct 14 14:06:47 crc kubenswrapper[4837]: I1014 14:06:47.264490 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-mhs2q_c2182e6f-c24c-4164-a269-4c11d34057a7/manager/0.log" Oct 14 14:06:47 crc 
kubenswrapper[4837]: I1014 14:06:47.347373 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-rfdvw_6017e7af-9d95-42c3-9f9c-bbd3df49f4f4/kube-rbac-proxy/0.log" Oct 14 14:06:47 crc kubenswrapper[4837]: I1014 14:06:47.374020 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-rfdvw_6017e7af-9d95-42c3-9f9c-bbd3df49f4f4/manager/0.log" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.151601 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cscml"] Oct 14 14:06:48 crc kubenswrapper[4837]: E1014 14:06:48.152035 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c530a8-0a48-48f9-8407-5916493fc82f" containerName="container-00" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.152050 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c530a8-0a48-48f9-8407-5916493fc82f" containerName="container-00" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.152271 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c530a8-0a48-48f9-8407-5916493fc82f" containerName="container-00" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.153700 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.185359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cscml"] Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.267604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqp52\" (UniqueName: \"kubernetes.io/projected/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-kube-api-access-wqp52\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.267677 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-utilities\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.268018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-catalog-content\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.369754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-catalog-content\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.370004 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wqp52\" (UniqueName: \"kubernetes.io/projected/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-kube-api-access-wqp52\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.370082 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-utilities\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.370291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-catalog-content\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.370598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-utilities\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.388498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqp52\" (UniqueName: \"kubernetes.io/projected/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-kube-api-access-wqp52\") pod \"redhat-marketplace-cscml\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.470295 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.940717 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cscml"] Oct 14 14:06:48 crc kubenswrapper[4837]: I1014 14:06:48.977931 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cscml" event={"ID":"b4885a19-ee05-4c8c-9416-f9b7a0ccb194","Type":"ContainerStarted","Data":"96a92d042936763a30e74b1ef93c20a3a7676a40e527b5492df3afedaf1ac2cc"} Oct 14 14:06:49 crc kubenswrapper[4837]: I1014 14:06:49.986325 4837 generic.go:334] "Generic (PLEG): container finished" podID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerID="c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58" exitCode=0 Oct 14 14:06:49 crc kubenswrapper[4837]: I1014 14:06:49.986532 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cscml" event={"ID":"b4885a19-ee05-4c8c-9416-f9b7a0ccb194","Type":"ContainerDied","Data":"c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58"} Oct 14 14:06:49 crc kubenswrapper[4837]: I1014 14:06:49.988427 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 14:06:52 crc kubenswrapper[4837]: I1014 14:06:52.003995 4837 generic.go:334] "Generic (PLEG): container finished" podID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerID="d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8" exitCode=0 Oct 14 14:06:52 crc kubenswrapper[4837]: I1014 14:06:52.004049 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cscml" event={"ID":"b4885a19-ee05-4c8c-9416-f9b7a0ccb194","Type":"ContainerDied","Data":"d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8"} Oct 14 14:06:53 crc kubenswrapper[4837]: I1014 14:06:53.018825 4837 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-cscml" event={"ID":"b4885a19-ee05-4c8c-9416-f9b7a0ccb194","Type":"ContainerStarted","Data":"af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc"} Oct 14 14:06:53 crc kubenswrapper[4837]: I1014 14:06:53.040668 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cscml" podStartSLOduration=2.508876438 podStartE2EDuration="5.040652598s" podCreationTimestamp="2025-10-14 14:06:48 +0000 UTC" firstStartedPulling="2025-10-14 14:06:49.988224571 +0000 UTC m=+3947.905224384" lastFinishedPulling="2025-10-14 14:06:52.520000731 +0000 UTC m=+3950.437000544" observedRunningTime="2025-10-14 14:06:53.036355591 +0000 UTC m=+3950.953355394" watchObservedRunningTime="2025-10-14 14:06:53.040652598 +0000 UTC m=+3950.957652411" Oct 14 14:06:58 crc kubenswrapper[4837]: I1014 14:06:58.471213 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:58 crc kubenswrapper[4837]: I1014 14:06:58.471762 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:58 crc kubenswrapper[4837]: I1014 14:06:58.526901 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:59 crc kubenswrapper[4837]: I1014 14:06:59.146663 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:06:59 crc kubenswrapper[4837]: I1014 14:06:59.204095 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cscml"] Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.107239 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cscml" 
podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="registry-server" containerID="cri-o://af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc" gracePeriod=2 Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.522705 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.646771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-catalog-content\") pod \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.646835 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqp52\" (UniqueName: \"kubernetes.io/projected/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-kube-api-access-wqp52\") pod \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.647031 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-utilities\") pod \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\" (UID: \"b4885a19-ee05-4c8c-9416-f9b7a0ccb194\") " Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.648021 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-utilities" (OuterVolumeSpecName: "utilities") pod "b4885a19-ee05-4c8c-9416-f9b7a0ccb194" (UID: "b4885a19-ee05-4c8c-9416-f9b7a0ccb194"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.655235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-kube-api-access-wqp52" (OuterVolumeSpecName: "kube-api-access-wqp52") pod "b4885a19-ee05-4c8c-9416-f9b7a0ccb194" (UID: "b4885a19-ee05-4c8c-9416-f9b7a0ccb194"). InnerVolumeSpecName "kube-api-access-wqp52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.661371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4885a19-ee05-4c8c-9416-f9b7a0ccb194" (UID: "b4885a19-ee05-4c8c-9416-f9b7a0ccb194"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.749509 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.749571 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqp52\" (UniqueName: \"kubernetes.io/projected/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-kube-api-access-wqp52\") on node \"crc\" DevicePath \"\"" Oct 14 14:07:01 crc kubenswrapper[4837]: I1014 14:07:01.749594 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4885a19-ee05-4c8c-9416-f9b7a0ccb194-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.117992 4837 generic.go:334] "Generic (PLEG): container finished" podID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" 
containerID="af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc" exitCode=0 Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.118090 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cscml" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.118144 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cscml" event={"ID":"b4885a19-ee05-4c8c-9416-f9b7a0ccb194","Type":"ContainerDied","Data":"af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc"} Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.118502 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cscml" event={"ID":"b4885a19-ee05-4c8c-9416-f9b7a0ccb194","Type":"ContainerDied","Data":"96a92d042936763a30e74b1ef93c20a3a7676a40e527b5492df3afedaf1ac2cc"} Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.118537 4837 scope.go:117] "RemoveContainer" containerID="af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.137934 4837 scope.go:117] "RemoveContainer" containerID="d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.165394 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cscml"] Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.170221 4837 scope.go:117] "RemoveContainer" containerID="c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.177898 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cscml"] Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.221947 4837 scope.go:117] "RemoveContainer" containerID="af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc" Oct 14 
14:07:02 crc kubenswrapper[4837]: E1014 14:07:02.222775 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc\": container with ID starting with af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc not found: ID does not exist" containerID="af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.222833 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc"} err="failed to get container status \"af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc\": rpc error: code = NotFound desc = could not find container \"af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc\": container with ID starting with af42d7355a8ebbb532952133098263555c40d5938f4a46c4052f46b9aea72abc not found: ID does not exist" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.222859 4837 scope.go:117] "RemoveContainer" containerID="d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8" Oct 14 14:07:02 crc kubenswrapper[4837]: E1014 14:07:02.223426 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8\": container with ID starting with d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8 not found: ID does not exist" containerID="d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.223457 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8"} err="failed to get container status 
\"d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8\": rpc error: code = NotFound desc = could not find container \"d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8\": container with ID starting with d3d4a788e7e88a01f7d64d9ef3cc9b48c431a9946e61d537c53b93bd0c11fec8 not found: ID does not exist" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.223490 4837 scope.go:117] "RemoveContainer" containerID="c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58" Oct 14 14:07:02 crc kubenswrapper[4837]: E1014 14:07:02.223847 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58\": container with ID starting with c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58 not found: ID does not exist" containerID="c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.223871 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58"} err="failed to get container status \"c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58\": rpc error: code = NotFound desc = could not find container \"c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58\": container with ID starting with c29204ed029bca2f06c29d510b6058317af4b4dc63d01750d994d7a151ee1f58 not found: ID does not exist" Oct 14 14:07:02 crc kubenswrapper[4837]: I1014 14:07:02.814079 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" path="/var/lib/kubelet/pods/b4885a19-ee05-4c8c-9416-f9b7a0ccb194/volumes" Oct 14 14:07:04 crc kubenswrapper[4837]: I1014 14:07:04.407378 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wzp95_1577b547-7e30-4b8e-9959-fdd88088041c/control-plane-machine-set-operator/0.log" Oct 14 14:07:04 crc kubenswrapper[4837]: I1014 14:07:04.582009 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dfjx8_675483c3-eb80-41b4-b02b-db9059ec788b/kube-rbac-proxy/0.log" Oct 14 14:07:04 crc kubenswrapper[4837]: I1014 14:07:04.601594 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dfjx8_675483c3-eb80-41b4-b02b-db9059ec788b/machine-api-operator/0.log" Oct 14 14:07:17 crc kubenswrapper[4837]: I1014 14:07:17.158179 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-65869_f782d810-8b08-4a07-b024-0481a26cf944/cert-manager-controller/0.log" Oct 14 14:07:17 crc kubenswrapper[4837]: I1014 14:07:17.327169 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-nqrzh_ffee16ce-49f5-418a-b83c-64b60165f84e/cert-manager-cainjector/0.log" Oct 14 14:07:17 crc kubenswrapper[4837]: I1014 14:07:17.366370 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zktdz_ca647993-67e2-4c73-b529-68deed403e7f/cert-manager-webhook/0.log" Oct 14 14:07:29 crc kubenswrapper[4837]: I1014 14:07:29.462320 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-5jz9k_8e30fca9-8930-4438-baeb-6cd8437d808e/nmstate-console-plugin/0.log" Oct 14 14:07:29 crc kubenswrapper[4837]: I1014 14:07:29.616754 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zpg5h_f596383f-8fd9-42cc-9554-8cfac0f1cbeb/nmstate-handler/0.log" Oct 14 14:07:29 crc kubenswrapper[4837]: I1014 14:07:29.666472 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2fxhs_fe630318-04d6-4ba7-98d4-004f61f9e801/kube-rbac-proxy/0.log" Oct 14 14:07:29 crc kubenswrapper[4837]: I1014 14:07:29.719896 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2fxhs_fe630318-04d6-4ba7-98d4-004f61f9e801/nmstate-metrics/0.log" Oct 14 14:07:29 crc kubenswrapper[4837]: I1014 14:07:29.832866 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-qmtct_a0b26320-e880-47dc-8ead-5b4547870db1/nmstate-operator/0.log" Oct 14 14:07:29 crc kubenswrapper[4837]: I1014 14:07:29.912015 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-dktjp_2e3f42bf-7e0b-4969-8b2e-0479072f35a4/nmstate-webhook/0.log" Oct 14 14:07:43 crc kubenswrapper[4837]: I1014 14:07:43.950210 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-9c8q6_f744e2d8-9bff-4348-8014-42a4a7a5cc20/kube-rbac-proxy/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.100835 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-9c8q6_f744e2d8-9bff-4348-8014-42a4a7a5cc20/controller/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.127988 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.341145 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.342900 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.348218 
4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.355228 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.523191 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.544979 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.583483 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.609322 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.761384 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-reloader/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.764840 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-metrics/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.827225 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/cp-frr-files/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.848937 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/controller/0.log" Oct 14 14:07:44 crc kubenswrapper[4837]: I1014 14:07:44.978605 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/frr-metrics/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.065970 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/kube-rbac-proxy/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.088372 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/kube-rbac-proxy-frr/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.207446 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/reloader/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.315869 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-hvtxf_529d2022-65d4-49b1-801d-f14d900cfdf7/frr-k8s-webhook-server/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.608029 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bb79b9dd7-l248c_0c827589-1da4-40cd-967d-4144c014cee8/manager/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.705280 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5788b958cf-vqdk2_58b3aa1b-4eaa-4b13-a503-a9789cfbe7c5/webhook-server/0.log" Oct 14 14:07:45 crc kubenswrapper[4837]: I1014 14:07:45.812889 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tlqsk_f5bb08ae-810b-4b13-a2aa-6ff68721a5a3/kube-rbac-proxy/0.log" Oct 14 14:07:46 crc kubenswrapper[4837]: I1014 14:07:46.363427 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tlqsk_f5bb08ae-810b-4b13-a2aa-6ff68721a5a3/speaker/0.log" Oct 14 14:07:46 crc kubenswrapper[4837]: I1014 14:07:46.511315 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cxg7q_5598bd38-632b-4225-8064-3352c0dac0de/frr/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.584847 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/util/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.731866 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/util/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.756024 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/pull/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.814669 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/pull/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.982576 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/extract/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.988717 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/util/0.log" Oct 14 14:07:57 crc kubenswrapper[4837]: I1014 14:07:57.989327 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lv7bw_f1f2ea4b-2083-4b4b-8547-ed6e783b1ec7/pull/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.278432 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-utilities/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.431456 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-utilities/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.436314 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-content/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.445339 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-content/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.593132 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-utilities/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.626205 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/extract-content/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.831692 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-utilities/0.log" Oct 14 14:07:58 crc kubenswrapper[4837]: I1014 14:07:58.996336 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-utilities/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.041724 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-content/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.068210 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-content/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.261750 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rbmpr_28f027db-f79f-4187-82df-c6d63c37ffce/registry-server/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.299647 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-content/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.315774 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/extract-utilities/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.542384 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/util/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.713489 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/util/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.754032 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/pull/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.817888 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hf72l_5484dc7a-db10-484c-94e5-faae9179b8bc/registry-server/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.845526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/pull/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.971572 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/util/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.979466 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/pull/0.log" Oct 14 14:07:59 crc kubenswrapper[4837]: I1014 14:07:59.986910 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwfqtv_306a4a1e-e6b7-4efb-aeba-2b570be7a5e6/extract/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.127670 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-t8tjh_cde48aeb-8f47-4fde-a2cb-a95c09051e43/marketplace-operator/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.201147 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-utilities/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.320373 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-utilities/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.363530 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-content/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.368462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-content/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.535654 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-utilities/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.589742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/extract-content/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.747308 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-utilities/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.761788 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b2ll4_61e9a58b-2860-421e-b616-4ca4234a1e24/registry-server/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.948303 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-utilities/0.log" Oct 14 14:08:00 crc kubenswrapper[4837]: I1014 14:08:00.951596 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-content/0.log" Oct 14 14:08:01 crc kubenswrapper[4837]: I1014 14:08:01.026767 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-content/0.log" Oct 14 14:08:01 crc kubenswrapper[4837]: I1014 14:08:01.396927 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-utilities/0.log" Oct 14 14:08:01 crc kubenswrapper[4837]: I1014 14:08:01.486958 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/extract-content/0.log" Oct 14 14:08:01 crc kubenswrapper[4837]: I1014 14:08:01.902834 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bvw9_3bd69629-bfc4-405d-8764-a2082b5c8449/registry-server/0.log" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.386873 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qpnn5"] Oct 14 14:08:07 crc kubenswrapper[4837]: E1014 14:08:07.387877 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="extract-content" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.387892 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="extract-content" Oct 14 14:08:07 crc kubenswrapper[4837]: E1014 14:08:07.387930 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="extract-utilities" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.387937 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" 
containerName="extract-utilities" Oct 14 14:08:07 crc kubenswrapper[4837]: E1014 14:08:07.387970 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="registry-server" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.387991 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="registry-server" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.388230 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4885a19-ee05-4c8c-9416-f9b7a0ccb194" containerName="registry-server" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.389813 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.408050 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpnn5"] Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.469936 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzxrg\" (UniqueName: \"kubernetes.io/projected/2ee3aede-6190-44f4-9334-c5ea92f3104f-kube-api-access-qzxrg\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.470024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-catalog-content\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.470426 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-utilities\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.572629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-catalog-content\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.572807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-utilities\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.572877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzxrg\" (UniqueName: \"kubernetes.io/projected/2ee3aede-6190-44f4-9334-c5ea92f3104f-kube-api-access-qzxrg\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.573262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-utilities\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.573283 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-catalog-content\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.592859 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzxrg\" (UniqueName: \"kubernetes.io/projected/2ee3aede-6190-44f4-9334-c5ea92f3104f-kube-api-access-qzxrg\") pod \"certified-operators-qpnn5\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:07 crc kubenswrapper[4837]: I1014 14:08:07.710671 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:08 crc kubenswrapper[4837]: I1014 14:08:08.202474 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpnn5"] Oct 14 14:08:08 crc kubenswrapper[4837]: I1014 14:08:08.675783 4837 generic.go:334] "Generic (PLEG): container finished" podID="2ee3aede-6190-44f4-9334-c5ea92f3104f" containerID="e0ac1513d9c601a5ff99a73efc0934eabb1039834073cc4adc4b9aa214ab3c5a" exitCode=0 Oct 14 14:08:08 crc kubenswrapper[4837]: I1014 14:08:08.675890 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerDied","Data":"e0ac1513d9c601a5ff99a73efc0934eabb1039834073cc4adc4b9aa214ab3c5a"} Oct 14 14:08:08 crc kubenswrapper[4837]: I1014 14:08:08.676142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerStarted","Data":"c251e1e88b7397a99442c20c95746a808c0b8c231d8dcf7d586dedf93209679d"} Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.595245 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-k6jss"] Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.597890 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.612836 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6jss"] Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.688592 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerStarted","Data":"1b7134ca89d97a37b584cc52f34e0ddfd560d78c724236da6d4aa6b7ac43568a"} Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.719465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5q4\" (UniqueName: \"kubernetes.io/projected/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-kube-api-access-mr5q4\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.719716 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-catalog-content\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.719752 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-utilities\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " 
pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.822199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-catalog-content\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.822252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-utilities\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.822286 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5q4\" (UniqueName: \"kubernetes.io/projected/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-kube-api-access-mr5q4\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.823457 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-utilities\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.823683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-catalog-content\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " 
pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.844644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5q4\" (UniqueName: \"kubernetes.io/projected/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-kube-api-access-mr5q4\") pod \"community-operators-k6jss\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:09 crc kubenswrapper[4837]: I1014 14:08:09.927550 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:10 crc kubenswrapper[4837]: I1014 14:08:10.465312 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6jss"] Oct 14 14:08:10 crc kubenswrapper[4837]: I1014 14:08:10.703671 4837 generic.go:334] "Generic (PLEG): container finished" podID="2ee3aede-6190-44f4-9334-c5ea92f3104f" containerID="1b7134ca89d97a37b584cc52f34e0ddfd560d78c724236da6d4aa6b7ac43568a" exitCode=0 Oct 14 14:08:10 crc kubenswrapper[4837]: I1014 14:08:10.703759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerDied","Data":"1b7134ca89d97a37b584cc52f34e0ddfd560d78c724236da6d4aa6b7ac43568a"} Oct 14 14:08:10 crc kubenswrapper[4837]: I1014 14:08:10.708806 4837 generic.go:334] "Generic (PLEG): container finished" podID="5617036b-4f3a-42c4-b5a0-cbff2a2e0862" containerID="d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586" exitCode=0 Oct 14 14:08:10 crc kubenswrapper[4837]: I1014 14:08:10.708858 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6jss" event={"ID":"5617036b-4f3a-42c4-b5a0-cbff2a2e0862","Type":"ContainerDied","Data":"d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586"} Oct 14 14:08:10 crc 
kubenswrapper[4837]: I1014 14:08:10.708894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6jss" event={"ID":"5617036b-4f3a-42c4-b5a0-cbff2a2e0862","Type":"ContainerStarted","Data":"aedc8acc53ab51a1729713269b267692ed94aa9efd01a7c2fd6799dd71e5856d"} Oct 14 14:08:11 crc kubenswrapper[4837]: I1014 14:08:11.139844 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 14:08:11 crc kubenswrapper[4837]: I1014 14:08:11.139918 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 14:08:12 crc kubenswrapper[4837]: I1014 14:08:12.734974 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerStarted","Data":"749f17cdd9ed9a54fc5ffa95895288c399e4999cf32b1dc969ef3ac0bd07abe6"} Oct 14 14:08:12 crc kubenswrapper[4837]: I1014 14:08:12.736821 4837 generic.go:334] "Generic (PLEG): container finished" podID="5617036b-4f3a-42c4-b5a0-cbff2a2e0862" containerID="263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0" exitCode=0 Oct 14 14:08:12 crc kubenswrapper[4837]: I1014 14:08:12.736855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6jss" event={"ID":"5617036b-4f3a-42c4-b5a0-cbff2a2e0862","Type":"ContainerDied","Data":"263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0"} Oct 14 14:08:12 crc kubenswrapper[4837]: I1014 
14:08:12.764662 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qpnn5" podStartSLOduration=2.970134426 podStartE2EDuration="5.764638763s" podCreationTimestamp="2025-10-14 14:08:07 +0000 UTC" firstStartedPulling="2025-10-14 14:08:08.67860889 +0000 UTC m=+4026.595608713" lastFinishedPulling="2025-10-14 14:08:11.473113197 +0000 UTC m=+4029.390113050" observedRunningTime="2025-10-14 14:08:12.755773752 +0000 UTC m=+4030.672773575" watchObservedRunningTime="2025-10-14 14:08:12.764638763 +0000 UTC m=+4030.681638576" Oct 14 14:08:13 crc kubenswrapper[4837]: I1014 14:08:13.748734 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6jss" event={"ID":"5617036b-4f3a-42c4-b5a0-cbff2a2e0862","Type":"ContainerStarted","Data":"497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c"} Oct 14 14:08:13 crc kubenswrapper[4837]: I1014 14:08:13.769198 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6jss" podStartSLOduration=2.162418329 podStartE2EDuration="4.769181691s" podCreationTimestamp="2025-10-14 14:08:09 +0000 UTC" firstStartedPulling="2025-10-14 14:08:10.710485974 +0000 UTC m=+4028.627485807" lastFinishedPulling="2025-10-14 14:08:13.317249356 +0000 UTC m=+4031.234249169" observedRunningTime="2025-10-14 14:08:13.768495832 +0000 UTC m=+4031.685495645" watchObservedRunningTime="2025-10-14 14:08:13.769181691 +0000 UTC m=+4031.686181504" Oct 14 14:08:17 crc kubenswrapper[4837]: I1014 14:08:17.711538 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:17 crc kubenswrapper[4837]: I1014 14:08:17.712073 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:17 crc kubenswrapper[4837]: I1014 14:08:17.758845 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:17 crc kubenswrapper[4837]: I1014 14:08:17.858778 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:18 crc kubenswrapper[4837]: I1014 14:08:18.382236 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpnn5"] Oct 14 14:08:19 crc kubenswrapper[4837]: I1014 14:08:19.809626 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qpnn5" podUID="2ee3aede-6190-44f4-9334-c5ea92f3104f" containerName="registry-server" containerID="cri-o://749f17cdd9ed9a54fc5ffa95895288c399e4999cf32b1dc969ef3ac0bd07abe6" gracePeriod=2 Oct 14 14:08:19 crc kubenswrapper[4837]: I1014 14:08:19.928506 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:19 crc kubenswrapper[4837]: I1014 14:08:19.928585 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:20 crc kubenswrapper[4837]: I1014 14:08:20.008007 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:20 crc kubenswrapper[4837]: I1014 14:08:20.826841 4837 generic.go:334] "Generic (PLEG): container finished" podID="2ee3aede-6190-44f4-9334-c5ea92f3104f" containerID="749f17cdd9ed9a54fc5ffa95895288c399e4999cf32b1dc969ef3ac0bd07abe6" exitCode=0 Oct 14 14:08:20 crc kubenswrapper[4837]: I1014 14:08:20.826916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" 
event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerDied","Data":"749f17cdd9ed9a54fc5ffa95895288c399e4999cf32b1dc969ef3ac0bd07abe6"} Oct 14 14:08:20 crc kubenswrapper[4837]: I1014 14:08:20.827195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpnn5" event={"ID":"2ee3aede-6190-44f4-9334-c5ea92f3104f","Type":"ContainerDied","Data":"c251e1e88b7397a99442c20c95746a808c0b8c231d8dcf7d586dedf93209679d"} Oct 14 14:08:20 crc kubenswrapper[4837]: I1014 14:08:20.827211 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c251e1e88b7397a99442c20c95746a808c0b8c231d8dcf7d586dedf93209679d" Oct 14 14:08:20 crc kubenswrapper[4837]: I1014 14:08:20.947957 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.004423 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.040283 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-utilities\") pod \"2ee3aede-6190-44f4-9334-c5ea92f3104f\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.040391 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzxrg\" (UniqueName: \"kubernetes.io/projected/2ee3aede-6190-44f4-9334-c5ea92f3104f-kube-api-access-qzxrg\") pod \"2ee3aede-6190-44f4-9334-c5ea92f3104f\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.040516 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-catalog-content\") pod \"2ee3aede-6190-44f4-9334-c5ea92f3104f\" (UID: \"2ee3aede-6190-44f4-9334-c5ea92f3104f\") " Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.040850 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-utilities" (OuterVolumeSpecName: "utilities") pod "2ee3aede-6190-44f4-9334-c5ea92f3104f" (UID: "2ee3aede-6190-44f4-9334-c5ea92f3104f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.040990 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.048441 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee3aede-6190-44f4-9334-c5ea92f3104f-kube-api-access-qzxrg" (OuterVolumeSpecName: "kube-api-access-qzxrg") pod "2ee3aede-6190-44f4-9334-c5ea92f3104f" (UID: "2ee3aede-6190-44f4-9334-c5ea92f3104f"). InnerVolumeSpecName "kube-api-access-qzxrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.107601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ee3aede-6190-44f4-9334-c5ea92f3104f" (UID: "2ee3aede-6190-44f4-9334-c5ea92f3104f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.143277 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzxrg\" (UniqueName: \"kubernetes.io/projected/2ee3aede-6190-44f4-9334-c5ea92f3104f-kube-api-access-qzxrg\") on node \"crc\" DevicePath \"\"" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.143311 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee3aede-6190-44f4-9334-c5ea92f3104f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.835952 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpnn5" Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.880759 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpnn5"] Oct 14 14:08:21 crc kubenswrapper[4837]: I1014 14:08:21.911123 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qpnn5"] Oct 14 14:08:22 crc kubenswrapper[4837]: I1014 14:08:22.377177 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6jss"] Oct 14 14:08:22 crc kubenswrapper[4837]: I1014 14:08:22.805221 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee3aede-6190-44f4-9334-c5ea92f3104f" path="/var/lib/kubelet/pods/2ee3aede-6190-44f4-9334-c5ea92f3104f/volumes" Oct 14 14:08:22 crc kubenswrapper[4837]: I1014 14:08:22.845956 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6jss" podUID="5617036b-4f3a-42c4-b5a0-cbff2a2e0862" containerName="registry-server" containerID="cri-o://497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c" gracePeriod=2 Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 
14:08:23.418592 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.489364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-catalog-content\") pod \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.489555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr5q4\" (UniqueName: \"kubernetes.io/projected/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-kube-api-access-mr5q4\") pod \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.489575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-utilities\") pod \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\" (UID: \"5617036b-4f3a-42c4-b5a0-cbff2a2e0862\") " Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.490258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-utilities" (OuterVolumeSpecName: "utilities") pod "5617036b-4f3a-42c4-b5a0-cbff2a2e0862" (UID: "5617036b-4f3a-42c4-b5a0-cbff2a2e0862"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.490662 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-utilities\") on node \"crc\" DevicePath \"\"" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.500265 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-kube-api-access-mr5q4" (OuterVolumeSpecName: "kube-api-access-mr5q4") pod "5617036b-4f3a-42c4-b5a0-cbff2a2e0862" (UID: "5617036b-4f3a-42c4-b5a0-cbff2a2e0862"). InnerVolumeSpecName "kube-api-access-mr5q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.559821 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5617036b-4f3a-42c4-b5a0-cbff2a2e0862" (UID: "5617036b-4f3a-42c4-b5a0-cbff2a2e0862"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.592281 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.592319 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr5q4\" (UniqueName: \"kubernetes.io/projected/5617036b-4f3a-42c4-b5a0-cbff2a2e0862-kube-api-access-mr5q4\") on node \"crc\" DevicePath \"\"" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.858364 4837 generic.go:334] "Generic (PLEG): container finished" podID="5617036b-4f3a-42c4-b5a0-cbff2a2e0862" containerID="497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c" exitCode=0 Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.858421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6jss" event={"ID":"5617036b-4f3a-42c4-b5a0-cbff2a2e0862","Type":"ContainerDied","Data":"497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c"} Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.858443 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6jss" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.858471 4837 scope.go:117] "RemoveContainer" containerID="497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.858458 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6jss" event={"ID":"5617036b-4f3a-42c4-b5a0-cbff2a2e0862","Type":"ContainerDied","Data":"aedc8acc53ab51a1729713269b267692ed94aa9efd01a7c2fd6799dd71e5856d"} Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.886213 4837 scope.go:117] "RemoveContainer" containerID="263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.925693 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6jss"] Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.930179 4837 scope.go:117] "RemoveContainer" containerID="d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.937298 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6jss"] Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.956056 4837 scope.go:117] "RemoveContainer" containerID="497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c" Oct 14 14:08:23 crc kubenswrapper[4837]: E1014 14:08:23.956596 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c\": container with ID starting with 497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c not found: ID does not exist" containerID="497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.956628 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c"} err="failed to get container status \"497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c\": rpc error: code = NotFound desc = could not find container \"497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c\": container with ID starting with 497199be9c130da839af9d122d65470d7903db13e2ba8baec96bd2bfe31b840c not found: ID does not exist" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.956649 4837 scope.go:117] "RemoveContainer" containerID="263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0" Oct 14 14:08:23 crc kubenswrapper[4837]: E1014 14:08:23.957016 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0\": container with ID starting with 263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0 not found: ID does not exist" containerID="263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.957036 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0"} err="failed to get container status \"263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0\": rpc error: code = NotFound desc = could not find container \"263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0\": container with ID starting with 263acedd8cfd97d8b72af71b4d0c4a0dcc9abdc57aa04f435aa4a4c6d54827e0 not found: ID does not exist" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.957049 4837 scope.go:117] "RemoveContainer" containerID="d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586" Oct 14 14:08:23 crc kubenswrapper[4837]: E1014 
14:08:23.957405 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586\": container with ID starting with d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586 not found: ID does not exist" containerID="d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586" Oct 14 14:08:23 crc kubenswrapper[4837]: I1014 14:08:23.957433 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586"} err="failed to get container status \"d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586\": rpc error: code = NotFound desc = could not find container \"d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586\": container with ID starting with d16069440ae86c092531fa9cbcd99f972e5eef93aac5b0206b0a46f818e65586 not found: ID does not exist" Oct 14 14:08:24 crc kubenswrapper[4837]: I1014 14:08:24.794007 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5617036b-4f3a-42c4-b5a0-cbff2a2e0862" path="/var/lib/kubelet/pods/5617036b-4f3a-42c4-b5a0-cbff2a2e0862/volumes" Oct 14 14:08:41 crc kubenswrapper[4837]: I1014 14:08:41.141624 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 14:08:41 crc kubenswrapper[4837]: I1014 14:08:41.142206 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.140106 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.140746 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.140801 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.141653 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d41c0ab4b0c247f216128b509a252b01626127621aa5db49ecb9deb60bb0056"} pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.141717 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" containerID="cri-o://7d41c0ab4b0c247f216128b509a252b01626127621aa5db49ecb9deb60bb0056" gracePeriod=600 Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.289268 4837 generic.go:334] "Generic (PLEG): container finished" podID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" 
containerID="7d41c0ab4b0c247f216128b509a252b01626127621aa5db49ecb9deb60bb0056" exitCode=0 Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.289350 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerDied","Data":"7d41c0ab4b0c247f216128b509a252b01626127621aa5db49ecb9deb60bb0056"} Oct 14 14:09:11 crc kubenswrapper[4837]: I1014 14:09:11.289619 4837 scope.go:117] "RemoveContainer" containerID="976619aaa14e22d3b65a21a26d0cf3b47d82d844a4849c68bee45670ac2a5dcb" Oct 14 14:09:12 crc kubenswrapper[4837]: I1014 14:09:12.303654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" event={"ID":"d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3","Type":"ContainerStarted","Data":"bfdf3cff098e12a31c4ccd1d93635fec143ee288ef2fa8903e9bb0b1825e3c19"} Oct 14 14:09:41 crc kubenswrapper[4837]: I1014 14:09:41.613828 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7de80d5-614f-47a2-a149-c036f5e5c1c8" containerID="ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71" exitCode=0 Oct 14 14:09:41 crc kubenswrapper[4837]: I1014 14:09:41.613923 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9bw42/must-gather-s6f72" event={"ID":"a7de80d5-614f-47a2-a149-c036f5e5c1c8","Type":"ContainerDied","Data":"ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71"} Oct 14 14:09:41 crc kubenswrapper[4837]: I1014 14:09:41.615756 4837 scope.go:117] "RemoveContainer" containerID="ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71" Oct 14 14:09:41 crc kubenswrapper[4837]: I1014 14:09:41.894724 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9bw42_must-gather-s6f72_a7de80d5-614f-47a2-a149-c036f5e5c1c8/gather/0.log" Oct 14 14:09:51 crc kubenswrapper[4837]: I1014 14:09:51.561219 4837 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9bw42/must-gather-s6f72"] Oct 14 14:09:51 crc kubenswrapper[4837]: I1014 14:09:51.561908 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9bw42/must-gather-s6f72" podUID="a7de80d5-614f-47a2-a149-c036f5e5c1c8" containerName="copy" containerID="cri-o://ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb" gracePeriod=2 Oct 14 14:09:51 crc kubenswrapper[4837]: I1014 14:09:51.568542 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9bw42/must-gather-s6f72"] Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.040977 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9bw42_must-gather-s6f72_a7de80d5-614f-47a2-a149-c036f5e5c1c8/copy/0.log" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.042067 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.169307 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrgwx\" (UniqueName: \"kubernetes.io/projected/a7de80d5-614f-47a2-a149-c036f5e5c1c8-kube-api-access-jrgwx\") pod \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.169400 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7de80d5-614f-47a2-a149-c036f5e5c1c8-must-gather-output\") pod \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\" (UID: \"a7de80d5-614f-47a2-a149-c036f5e5c1c8\") " Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.176384 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a7de80d5-614f-47a2-a149-c036f5e5c1c8-kube-api-access-jrgwx" (OuterVolumeSpecName: "kube-api-access-jrgwx") pod "a7de80d5-614f-47a2-a149-c036f5e5c1c8" (UID: "a7de80d5-614f-47a2-a149-c036f5e5c1c8"). InnerVolumeSpecName "kube-api-access-jrgwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.271545 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrgwx\" (UniqueName: \"kubernetes.io/projected/a7de80d5-614f-47a2-a149-c036f5e5c1c8-kube-api-access-jrgwx\") on node \"crc\" DevicePath \"\"" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.314076 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7de80d5-614f-47a2-a149-c036f5e5c1c8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a7de80d5-614f-47a2-a149-c036f5e5c1c8" (UID: "a7de80d5-614f-47a2-a149-c036f5e5c1c8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.373587 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7de80d5-614f-47a2-a149-c036f5e5c1c8-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.715336 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9bw42_must-gather-s6f72_a7de80d5-614f-47a2-a149-c036f5e5c1c8/copy/0.log" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.716008 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7de80d5-614f-47a2-a149-c036f5e5c1c8" containerID="ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb" exitCode=143 Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.716063 4837 scope.go:117] "RemoveContainer" containerID="ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.716116 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9bw42/must-gather-s6f72" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.755348 4837 scope.go:117] "RemoveContainer" containerID="ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.802627 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7de80d5-614f-47a2-a149-c036f5e5c1c8" path="/var/lib/kubelet/pods/a7de80d5-614f-47a2-a149-c036f5e5c1c8/volumes" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.853757 4837 scope.go:117] "RemoveContainer" containerID="ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb" Oct 14 14:09:52 crc kubenswrapper[4837]: E1014 14:09:52.854257 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb\": container with ID starting with ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb not found: ID does not exist" containerID="ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.854302 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb"} err="failed to get container status \"ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb\": rpc error: code = NotFound desc = could not find container \"ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb\": container with ID starting with ffe246cd9ddc2e119a2ae58ec3c9d8a5a3e8677c60ab27df060e52b25cd3a6cb not found: ID does not exist" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.854323 4837 scope.go:117] "RemoveContainer" containerID="ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71" Oct 14 14:09:52 crc kubenswrapper[4837]: E1014 14:09:52.854730 4837 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71\": container with ID starting with ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71 not found: ID does not exist" containerID="ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71" Oct 14 14:09:52 crc kubenswrapper[4837]: I1014 14:09:52.854750 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71"} err="failed to get container status \"ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71\": rpc error: code = NotFound desc = could not find container \"ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71\": container with ID starting with ea0af7a3cbe88978baba72017a5d305dd3d713097c63b38ba25108f42e479d71 not found: ID does not exist" Oct 14 14:11:11 crc kubenswrapper[4837]: I1014 14:11:11.140253 4837 patch_prober.go:28] interesting pod/machine-config-daemon-h4ggd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 14 14:11:11 crc kubenswrapper[4837]: I1014 14:11:11.140901 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h4ggd" podUID="d7ba7fa6-d0a5-4e80-a6e4-33d7ce2081d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"